
Showing papers in "Econometrica" in 2017


Journal ArticleDOI
TL;DR: The authors examined the role of uncertainty shocks in a one-sector, representative-agent dynamic stochastic general-equilibrium model and found that increased uncertainty about the future may indeed have played a significant role in worsening the Great Recession.
Abstract: This paper examines the role of uncertainty shocks in a one-sector, representative-agent dynamic stochastic general-equilibrium model. When prices are flexible, uncertainty shocks are not capable of producing business-cycle comovements among key macro variables. With countercyclical markups through sticky prices, however, uncertainty shocks can generate fluctuations that are consistent with business cycles. Monetary policy usually plays a key role in offsetting the negative impact of uncertainty shocks. If the central bank is constrained by the zero lower bound, then monetary policy can no longer perform its usual stabilizing function and higher uncertainty has even more negative effects on the economy. Calibrating the size of uncertainty shocks using fluctuations in the VIX, we find that increased uncertainty about the future may indeed have played a significant role in worsening the Great Recession, which is consistent with statements by policymakers, economists, and the financial press.

379 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a model of the relationship between real interest rates, credit spreads, and the structure and risk of the banking system, and characterize the equilibrium for a fixed aggregate supply of savings, showing that safer entrepreneurs will be funded by nonmonitoring banks and riskier entrepreneurs by monitoring banks.
Abstract: We present a model of the relationship between real interest rates, credit spreads, and the structure and risk of the banking system. Banks intermediate between entrepreneurs and investors, and can monitor entrepreneurs' projects. We characterize the equilibrium for a fixed aggregate supply of savings, showing that safer entrepreneurs will be funded by nonmonitoring banks and riskier entrepreneurs by monitoring banks. We show that an increase in savings reduces interest rates and spreads, and increases the relative size of the nonmonitoring banking system and the probability of failure of monitoring banks. We also show that the dynamic version of the model exhibits endogenous boom and bust cycles, and rationalizes the existence of countercyclical risk premia and the connection between low interest rates, tight credit spreads, and the buildup of risks during booms.

238 citations


Journal ArticleDOI
TL;DR: This article examined the long-term impact of state centralization on cultural norms in the Kuba Kingdom of Central Africa, finding that centralized formal institutions are associated with weaker norms of rule following and a greater propensity to cheat for material gain, consistent with models in which investments to inculcate values in children decline as formal enforcement of socially desirable behavior becomes more effective.
Abstract: We use variation in historical state centralization to examine the long-term impact of institutions on cultural norms. The Kuba Kingdom, established in Central Africa in the early 17th century by King Shyaam, had more developed state institutions than the other independent villages and chieftaincies in the region. It had an unwritten constitution, separation of political powers, a judicial system with courts and juries, a police force, a military, taxation, and significant public goods provision. Comparing individuals from the Kuba Kingdom to those from just outside the Kingdom, we find that centralized formal institutions are associated with weaker norms of rule following and a greater propensity to cheat for material gain. This finding is consistent with recent models where endogenous investments to inculcate values in children decline when there is an increase in the effectiveness of formal institutions that enforce socially desirable behavior. Consistent with such a mechanism, we find that Kuba parents believe it is less important to teach children values related to rule-following behaviors.

212 citations


Journal ArticleDOI
TL;DR: Kolotilin acknowledges financial support from the Australian Research Council; Zapechelnyuk acknowledges financial support from the Economic and Social Research Council (grant no. ES/N01829X/1).
Abstract: Kolotilin acknowledges financial support from the Australian Research Council. Zapechelnyuk acknowledges financial support from the Economic and Social Research Council (grant no. ES/N01829X/1).

189 citations


Journal ArticleDOI
TL;DR: In this article, the authors test and reject the hypothesis that the retail beer price increases observed after the consummation of the MillerCoors joint venture can be explained by movement from one Nash–Bertrand equilibrium to another; counterfactual simulations imply that post-venture prices are 6% to 8% higher, and markups 17% to 18% higher, than they would have been under Nash–Bertrand competition.
Abstract: We document abrupt increases in retail beer prices just after the consummation of the MillerCoors joint venture, both for MillerCoors and its major competitor, Anheuser‐Busch. Within the context of a differentiated‐products pricing model, we test and reject the hypothesis that the price increases can be explained by movement from one Nash–Bertrand equilibrium to another. Counterfactual simulations imply that prices after the joint venture are 6%–8% higher than they would have been with Nash–Bertrand competition, and that markups are 17%–18% higher. We relate the results to documentary evidence that the joint venture may have facilitated price coordination.

161 citations
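The counterfactual reported above rests on computing Nash–Bertrand equilibrium prices under alternative ownership structures. The sketch below illustrates that computation for a simple multinomial logit demand system; the mean utilities, price coefficient, marginal costs, and ownership assignments are made-up assumptions, and the paper's differentiated-products model is considerably richer.

```python
import numpy as np

def shares(p, delta, alpha):
    """Multinomial logit market shares with an outside good normalized to zero utility."""
    v = np.exp(delta - alpha * p)
    return v / (1.0 + v.sum())

def bertrand_prices(delta, alpha, mc, owner, tol=1e-10, max_iter=50_000):
    """Solve the multiproduct Nash-Bertrand first-order conditions s + (O * D)(p - c) = 0."""
    J = len(mc)
    same_owner = (owner[:, None] == owner[None, :]).astype(float)
    p = mc + 0.5                          # crude starting point
    for _ in range(max_iter):
        s = shares(p, delta, alpha)
        D = alpha * np.outer(s, s)        # D[j, k] = d s_k / d p_j for logit demand
        D[np.diag_indices(J)] = -alpha * s * (1.0 - s)
        p_new = mc - np.linalg.solve(same_owner * D, s)
        if np.max(np.abs(p_new - p)) < tol:
            return p_new
        p = p + 0.5 * (p_new - p)         # damped update for stability
    raise RuntimeError("price fixed point did not converge")

# Hypothetical demand and cost primitives for four brands (illustrative numbers only).
delta = np.array([2.0, 1.8, 1.5, 1.2])    # mean utilities
alpha = 1.5                               # price sensitivity
mc = np.array([1.0, 1.0, 0.9, 0.8])       # marginal costs

p_pre = bertrand_prices(delta, alpha, mc, owner=np.array([0, 1, 2, 3]))   # four independent firms
p_post = bertrand_prices(delta, alpha, mc, owner=np.array([0, 0, 2, 3]))  # brands 0 and 1 price jointly
print("price change from joint pricing (%):", np.round(100 * (p_post / p_pre - 1), 2))
```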


ReportDOI
TL;DR: In this article, the authors provide efficient estimators and honest confidence bands for a variety of treatment effects including local average (LATE) and local quantile treatment effects (LQTE) in data-rich environments.
Abstract: In this paper, we provide efficient estimators and honest confidence bands for a variety of treatment effects including local average (LATE) and local quantile treatment effects (LQTE) in data-rich environments. We can handle very many control variables, endogenous receipt of treatment, heterogeneous treatment effects, and function-valued outcomes. Our framework covers the special case of exogenous receipt of treatment, either conditional on controls or unconditionally as in randomized control trials. In the latter case, our approach produces efficient estimators and honest bands for (functional) average treatment effects (ATE) and quantile treatment effects (QTE). To make informative inference possible, we assume that key reduced-form predictive relationships are approximately sparse. This assumption allows the use of regularization and selection methods to estimate those relations, and we provide methods for post-regularization and post-selection inference that are uniformly valid (honest) across a wide range of models. We show that a key ingredient enabling honest inference is the use of orthogonal or doubly robust moment conditions in estimating certain reduced-form functional parameters. We illustrate the use of the proposed methods with an application to estimating the effect of 401(k) eligibility and participation on accumulated assets. The results on program evaluation are obtained as a consequence of more general results on honest inference in a general moment-condition framework, which arises from structural equation models in econometrics. Here, too, the crucial ingredient is the use of orthogonal moment conditions, which can be constructed from the initial moment conditions. We provide results on honest inference for (function-valued) parameters within this general framework where any high-quality, machine learning methods (e.g., boosted trees, deep neural networks, random forest, and their aggregated and hybrid versions) can be used to learn the nonparametric/high-dimensional components of the model. These include a number of supporting auxiliary results that are of major independent interest: namely, we (1) prove uniform validity of a multiplier bootstrap, (2) offer a uniformly valid functional delta method, and (3) provide results for sparsity-based estimation of regression functions for function-valued outcomes.

159 citations
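The key ingredient highlighted in the abstract above — orthogonal (doubly robust) moment conditions combined with regularized, cross-fitted estimation of reduced-form nuisance functions — can be illustrated with a minimal estimator of an average treatment effect under exogeneity. The sketch below uses lasso-type learners from scikit-learn on simulated data; the data-generating process, learner choices, and sample sizes are assumptions for illustration and do not reproduce the paper's 401(k) application.

```python
import numpy as np
from sklearn.linear_model import LassoCV, LogisticRegressionCV
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)

# Simulated data: many controls X, binary treatment D, outcome Y, true treatment effect = 1.0.
n, p = 2000, 200
X = rng.normal(size=(n, p))
D = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=n) > 0).astype(float)
Y = 1.0 * D + X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=n)

# Cross-fitted nuisances: propensity m(x) = P(D=1|X) and outcome regressions g_d(x) = E[Y|D=d,X].
m_hat, g0_hat, g1_hat = np.zeros(n), np.zeros(n), np.zeros(n)
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    prop = LogisticRegressionCV(Cs=10, penalty="l1", solver="liblinear").fit(X[train], D[train])
    m_hat[test] = prop.predict_proba(X[test])[:, 1]
    for d, g_hat in ((0, g0_hat), (1, g1_hat)):
        idx = train[D[train] == d]
        g_hat[test] = LassoCV(cv=5).fit(X[idx], Y[idx]).predict(X[test])

# Orthogonal (doubly robust / AIPW) score for the ATE, robust to regularization bias in the nuisances.
m_hat = np.clip(m_hat, 0.01, 0.99)
psi = (g1_hat - g0_hat
       + D * (Y - g1_hat) / m_hat
       - (1 - D) * (Y - g0_hat) / (1 - m_hat))
ate = psi.mean()
se = psi.std(ddof=1) / np.sqrt(n)
print(f"ATE estimate {ate:.3f}, 95% CI [{ate - 1.96*se:.3f}, {ate + 1.96*se:.3f}]")
```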


Journal ArticleDOI
TL;DR: It is shown that the optimum for the principal is simply to screen along each component separately, a result that requires no assumptions on the structure of preferences within each component.
Abstract: A principal wishes to screen an agent along several dimensions of private information simultaneously. The agent has quasilinear preferences that are additively separable across the various components. We consider a robust version of the principal's problem, in which she knows the marginal distribution of each component of the agent's type, but does not know the joint distribution. Any mechanism is evaluated by its worst-case expected profit, over all joint distributions consistent with the known marginals. We show that the optimum for the principal is simply to screen along each component separately. This result does not require any assumptions (such as single crossing) on the structure of preferences within each component. The proof technique involves a generalization of the concept of virtual values to arbitrary screening problems. Sample applications include monopoly pricing and a stylized dynamic taxation model.

150 citations


Journal ArticleDOI
TL;DR: In this paper, the authors characterize optimal mechanisms for the multiple-good monopoly problem and provide a framework to find them, and show that a mechanism is optimal if and only if a measure μ derived from the buyer's type distribution satisfies certain stochastic dominance conditions.
Abstract: We characterize optimal mechanisms for the multiple-good monopoly problem and provide a framework to find them. We show that a mechanism is optimal if and only if a measure μ derived from the buyer's type distribution satisfies certain stochastic dominance conditions. This measure expresses the marginal change in the seller's revenue under marginal changes in the rent paid to subsets of buyer types. As a corollary, we characterize the optimality of grand-bundling mechanisms, strengthening several results in the literature, where only sufficient optimality conditions have been derived. As an application, we show that the optimal mechanism for n independent uniform items each supported on [c,c+1] is a grand-bundling mechanism, as long as c is sufficiently large, extending Pavlov's (2011) result for two items. At the same time, our characterization also implies that, for all c and for all sufficiently large n, the optimal mechanism for n independent uniform items supported on [c,c+1] is not a grand-bundling mechanism.

141 citations


Journal ArticleDOI
TL;DR: The authors study the relationship between aspirations and the distribution of income, showing that extreme equality is unstable and that polarization arises when the same aspirations are shared across a society; the theory captures both the complacency stemming from low aspirations and the frustration resulting from aspirations that are too high.
Abstract: The premise of this paper is twofold. First, people’s aspirations for their future wellbeing (or that of their children) affect their incentives to invest. Second, the experiences of others help shape one’s aspirations. This paper marries a model of aspirations-based choice with a simple theory of aspirations formation to study the relationship between aspirations and the distribution of income. Through its impact on investments, aspirations affect economic mobility and the income distribution, which in turn shape aspirations. Thus aspirations, income, and the distribution of income evolve jointly, and in many situations in a self-reinforcing way. We study the consequences of this model for income distribution as well as for growth rates over different quantiles of the distribution. We show that extreme equality is unstable. Moreover, when the same aspirations are shared in a society, polarization arises. The theory we propose captures both the complacency stemming from low aspirations and the frustration resulting from aspirations that are too high. As a result, for commonly held aspirations, growth rates have an inverted U-shape along the income distribution.

139 citations


Journal ArticleDOI
TL;DR: This article developed a theory of intergenerational transmission of preferences that rationalizes the choice between alternative parenting styles (as set out in Baumrind, 1967), which is consistent with the decline of authoritarian parenting observed in industrialized countries, and with the greater prevalence of more permissive parenting in countries characterized by low inequality.
Abstract: We develop a theory of intergenerational transmission of preferences that rationalizes the choice between alternative parenting styles (as set out in Baumrind, 1967). Parents maximize an objective function that combines Beckerian altruism and paternalism towards children. They can affect their children’s choices via two channels: either by influencing children’s preferences or by imposing direct restrictions on their choice sets. Different parenting styles (authoritarian, authoritative, and permissive) emerge as equilibrium outcomes, and are affected both by parental preferences and by the socioeconomic environment. Parenting style, in turn, feeds back into the children’s welfare and economic success. The theory is consistent with the decline of authoritarian parenting observed in industrialized countries, and with the greater prevalence of more permissive parenting in countries characterized by low inequality.

139 citations


Journal ArticleDOI
TL;DR: This paper proposes an empirical model of network formation that combines strategic and random-network features and provides new identification results for ERGMs in large networks: if link externalities are nonnegative, the ERGM is asymptotically indistinguishable from an Erdős–Rényi model with independent links.
Abstract: This paper proposes an empirical model of network formation, combining strategic and random networks features. Payoffs depend on direct links, but also on link externalities. Players meet sequentially at random, myopically updating their links. Under mild assumptions, the network formation process is a potential game and converges to an exponential random graph model (ERGM), generating directed dense networks. I provide new identification results for ERGMs in large networks: if link externalities are nonnegative, the ERGM is asymptotically indistinguishable from an Erdős–Rényi model with independent links. We can identify the parameters only when at least one of the externalities is negative and sufficiently large. However, the standard estimation methods for ERGMs can have exponentially slow convergence, even when the model has asymptotically independent links. I thus estimate parameters using a Bayesian MCMC method. When the parameters are identifiable, I show evidence that the estimation algorithm converges in almost quadratic time.
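The convergence of the myopic, sequential link-updating process to an exponential random graph model can be mimicked with a small Gibbs-style simulation. In the sketch below the potential combines a direct-link payoff with a reciprocity externality; the parameter values, network size, and form of the potential are illustrative assumptions rather than the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
theta_link = -1.0      # baseline payoff of a directed link (assumed)
theta_recip = 0.8      # reciprocity externality (assumed)

def potential(G):
    """Potential Q(G) = theta_link * (# links) + theta_recip * (# reciprocated pairs)."""
    return theta_link * G.sum() + theta_recip * (G * G.T).sum() / 2.0

G = np.zeros((n, n), dtype=int)
for it in range(50_000):
    # A randomly drawn ordered pair (i, j) meets and i myopically revises the link i -> j.
    i, j = rng.choice(n, size=2, replace=False)
    G_with, G_without = G.copy(), G.copy()
    G_with[i, j], G_without[i, j] = 1, 0
    # Logit revision: the link is present with probability proportional to exp(potential).
    diff = potential(G_with) - potential(G_without)
    G[i, j] = int(rng.random() < 1.0 / (1.0 + np.exp(-diff)))

# In the long run, G is approximately a draw from the ERGM defined by the potential above.
density = G.sum() / (n * (n - 1))
reciprocated_share = (G * G.T).sum() / max(G.sum(), 1)
print("link density:", round(density, 3), " share of links reciprocated:", round(reciprocated_share, 3))
```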

Journal ArticleDOI
TL;DR: In this article, the authors solve a general class of dynamic rational-inattention problems in which an agent repeatedly acquires costly information about an evolving state and selects actions, and the solution resembles the choice rule in a dynamic logit model, but it is biased towards an optimal default rule that does not depend on the realized state.
Abstract: We solve a general class of dynamic rational-inattention problems in which an agent repeatedly acquires costly information about an evolving state and selects actions. The solution resembles the choice rule in a dynamic logit model, but it is biased towards an optimal default rule that does not depend on the realized state. We apply the general solution to the study of (i) the sunk-cost fallacy; (ii) inertia in actions leading to lagged adjustments to shocks; and (iii) the tradeoff between accuracy and delay in decision-making.
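The structure of the solution — logit-like choice probabilities tilted toward a state-independent default rule — already appears in the static rational-inattention problem with a mutual-information cost, which can be computed by the Blahut–Arimoto iteration sketched below. The payoff matrix, prior, and cost parameter are made up, and this static simplification is not the paper's dynamic algorithm.

```python
import numpy as np

# Payoff matrix U[a, w] over actions (rows) and states (columns), prior over states, information cost.
U = np.array([[1.0, 0.0, 0.2],
              [0.0, 1.0, 0.2],
              [0.5, 0.5, 0.6]])
mu = np.array([0.4, 0.4, 0.2])   # prior over states (assumed)
lam = 0.5                         # unit cost of mutual information (assumed)

# Blahut-Arimoto iteration: the optimal rule is p(a|w) proportional to q(a) * exp(U[a, w] / lam),
# where q(a) = sum_w mu(w) p(a|w) is the state-independent "default" rule.
q = np.full(U.shape[0], 1.0 / U.shape[0])
for _ in range(2000):
    p_cond = q[:, None] * np.exp(U / lam)
    p_cond /= p_cond.sum(axis=0, keepdims=True)
    q_new = p_cond @ mu
    if np.max(np.abs(q_new - q)) < 1e-12:
        q = q_new
        break
    q = q_new

print("default (unconditional) choice probabilities:", np.round(q, 3))
print("state-contingent choice probabilities:\n", np.round(p_cond, 3))
```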

ReportDOI
TL;DR: In this article, the authors introduce a model of undirected dyadic link formation which allows for assortative matching on observed agent characteristics (homophily) as well as unrestricted agent-level heterogeneity in link surplus (degree heterogeneity).
Abstract: I introduce a model of undirected dyadic link formation which allows for assortative matching on observed agent characteristics (homophily) as well as unrestricted agent‐level heterogeneity in link surplus (degree heterogeneity). Like in fixed effects panel data analyses, the joint distribution of observed and unobserved agent‐level characteristics is left unrestricted. Two estimators for the (common) homophily parameter, β₀, are developed and their properties studied under an asymptotic sequence involving a single network growing large. The first, tetrad logit (TL), estimator conditions on a sufficient statistic for the degree heterogeneity. The second, joint maximum likelihood (JML), estimator treats the degree heterogeneity parameters {A_i0}, i = 1, …, N, as additional (incidental) parameters to be estimated. The TL estimate is consistent under both sparse and dense graph sequences, whereas consistency of the JML estimate is shown only under dense graph sequences.
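A minimal version of the joint maximum likelihood idea — estimating the homophily coefficient jointly with agent-specific degree-heterogeneity parameters — is a logit with one dummy per agent, as sketched below on simulated undirected links. The simulation design is an assumption, and the tetrad logit estimator and the paper's asymptotic analysis are not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 60
x = rng.normal(size=n)                 # observed agent characteristic
A = rng.normal(scale=0.5, size=n)      # unobserved degree heterogeneity
beta0 = -1.0                           # true homophily coefficient on |x_i - x_j| (assumed)

# Simulate undirected links: P(link ij) = logistic(beta0 * |x_i - x_j| + A_i + A_j).
pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
w = np.array([abs(x[i] - x[j]) for i, j in pairs])
index = beta0 * w + np.array([A[i] + A[j] for i, j in pairs])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-index)))

# Joint MLE: logit of the link indicator on the dyad regressor plus one dummy per agent,
# treating the degree-heterogeneity terms as incidental parameters.
dummies = np.zeros((len(pairs), n))
for k, (i, j) in enumerate(pairs):
    dummies[k, i] = dummies[k, j] = 1.0
X = np.column_stack([w, dummies])
fit = sm.Logit(y, X).fit(method="lbfgs", maxiter=1000, disp=False)
print(f"estimated homophily coefficient: {fit.params[0]:.3f} (true value {beta0})")
```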

Journal ArticleDOI
TL;DR: In this paper, the authors studied the impact of time-varying idiosyncratic risk at the establishment level on unemployment fluctuations over 1972-2009 and built a tractable directed search model with firm dynamics.
Abstract: This paper studies the impact of time-varying idiosyncratic risk at the establishment level on unemployment fluctuations over 1972–2009. I build a tractable directed search model with firm dynamics and time-varying idiosyncratic volatility. The model allows for endogenous separations, entry and exit, and job-to-job transitions. I show that the model can replicate salient features of the microeconomic behavior of firms and that the introduction of volatility improves the fit of the model for standard business cycle moments. In a series of counterfactual experiments, I show that time-varying risk is important to account for the magnitude of fluctuations in aggregate unemployment for past U.S. recessions. Though the model can account for about 40% of the total increase in unemployment for the 2007–2009 recession, uncertainty alone is not sufficient to explain the magnitude and persistence of unemployment during that episode.

Journal ArticleDOI
TL;DR: In this article, the authors estimate a model of employer-insurer and hospitalinsurer bargaining over premiums and reimbursements, household demand for insurance, and individual demand for hospitals using detailed California admissions, claims, and enrollment data.
Abstract: The impact of insurer competition on welfare, negotiated provider prices, and premiums in the U.S. private health care industry is theoretically ambiguous. Reduced competition may increase the premiums charged by insurers and their payments made to hospitals. However, it may also strengthen insurers' bargaining leverage when negotiating with hospitals, thereby generating offsetting cost decreases. To understand and measure this trade-off, we estimate a model of employer-insurer and hospital-insurer bargaining over premiums and reimbursements, household demand for insurance, and individual demand for hospitals using detailed California admissions, claims, and enrollment data. We simulate the removal of both large and small insurers from consumers' choice sets. Although consumer welfare decreases and premiums typically increase, we find that premiums can fall upon the removal of a small insurer if an employer imposes effective premium constraints through negotiations with the remaining insurers. We also document substantial heterogeneity in hospital price adjustments upon the removal of an insurer, with renegotiated price increases and decreases of as much as 10% across markets.

Journal ArticleDOI
TL;DR: This paper provides conditions under which the same randomization construction can be used to construct tests that asymptotically control the probability of a false rejection whenever the distribution of the observed data exhibits approximate symmetry, in the sense that the limiting distribution of a function of the data exhibits symmetry under the null hypothesis.
Abstract: This paper develops a theory of randomization tests under an approximate symmetry assumption. Randomization tests provide a general means of constructing tests that control size in finite samples whenever the distribution of the observed data exhibits symmetry under the null hypothesis. Here, by "exhibits symmetry" we mean that the distribution remains invariant under a group of transformations. In this paper, we provide conditions under which the same construction can be used to construct tests that asymptotically control the probability of a false rejection whenever the distribution of the observed data exhibits approximate symmetry in the sense that the limiting distribution of a function of the data exhibits symmetry under the null hypothesis. An important application of this idea is in settings where the data may be grouped into a fixed number of "clusters" with a large number of observations within each cluster. In such settings, we show that the distribution of the observed data satisfies our approximate symmetry requirement under weak assumptions. In particular, our results allow for the clusters to be heterogeneous and also have dependence not only within each cluster, but also across clusters. This approach enjoys several advantages over other approaches in these settings. Among other things, it leads to a test that is asymptotically similar, which, as shown in a simulation study, translates into improved power at many alternatives. Finally, we use our results to revisit the analysis of Angrist and Lavy (2009), who examine the impact of a cash award on exam performance for low-achievement students in Israel.
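In the clustered setting, the approximate-symmetry construction amounts to computing the statistic of interest cluster by cluster and randomizing over sign changes of the cluster-level statistics. The toy sketch below tests a zero mean with a small, fixed number of heterogeneous clusters; the data-generating process and the choice of a studentized statistic are assumptions for illustration.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)

# A small, fixed number of heterogeneous clusters, each with many observations.
clusters = [rng.normal(loc=0.3, scale=s, size=m)
            for s, m in [(1.0, 400), (2.0, 250), (0.5, 600), (1.5, 300), (1.0, 500)]]

# Cluster-level statistics: under H0: mu = 0, their limiting distribution is symmetric about 0.
S = np.array([np.sqrt(len(c)) * c.mean() for c in clusters])

def T(v):
    """Studentized statistic built from the cluster-level values."""
    return abs(v.mean()) / (v.std(ddof=1) / np.sqrt(len(v)))

# Randomization distribution: recompute T under every sign change of the cluster statistics.
t_obs = T(S)
t_rand = np.array([T(S * np.array(signs)) for signs in product([-1, 1], repeat=len(S))])
p_value = (t_rand >= t_obs).mean()
print(f"observed T = {t_obs:.2f}, randomization p-value = {p_value:.3f}")
```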

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a perfectly competitive model of a market with adverse selection, where prices are determined by zero-profit conditions, and the set of traded contracts is determined by free entry.
Abstract: This paper proposes a perfectly competitive model of a market with adverse selection. Prices are determined by zero-profit conditions, and the set of traded contracts is determined by free entry. Crucially for applications, contract characteristics are endogenously determined, consumers may have multiple dimensions of private information, and an equilibrium always exists. Equilibrium corresponds to the limit of a differentiated products Bertrand game. We apply the model to establish theoretical results on the equilibrium effects of mandates. Mandates can increase efficiency but have unintended consequences. With adverse selection, an insurance mandate reduces the price of low-coverage policies, which necessarily has indirect effects such as increasing adverse selection on the intensive margin and causing some consumers to purchase less coverage.

Journal ArticleDOI
TL;DR: It is shown theoretically that all parameters of the classic model of sorting based on absolute advantage in Becker (1973) with search frictions can be identified using only matched employer-employee data on wages and labor market transitions.
Abstract: We assess the empirical content of equilibrium models of labor market sorting based on unobserved (to economists) characteristics. In particular, we show theoretically that all parameters of the classic model of sorting based on absolute advantage in Becker (1973) with search frictions can be nonparametrically identified using only matched employer–employee data on wages and labor market transitions. In particular, these data are sufficient to nonparametrically estimate the output of any individual worker with any given firm. Our identification proof is constructive and we provide computational algorithms that implement our identification strategy given the limitations of the available data sets. Finally, we add on-the-job search to the model, extend the identification strategy, and apply it to a large German matched employer–employee data set to describe detailed patterns of sorting and properties of the production function.

ReportDOI
TL;DR: In this paper, a method to correct for sample selection in quantile regression models is proposed: selection is modelled via the copula of the percentile error in the outcome equation and the error in the participation decision, and the copula parameters are estimated by minimizing a method-of-moments criterion.
Abstract: We propose a method to correct for sample selection in quantile regression models. Selection is modelled via the cumulative distribution function, or copula, of the percentile error in the outcome equation and the error in the participation decision. Copula parameters are estimated by minimizing a method-of-moments criterion. Given these parameter estimates, the percentile levels of the outcome are re-adjusted to correct for selection, and quantile parameters are estimated by minimizing a rotated "check" function. We apply the method to correct wage percentiles for selection into employment, using data for the UK for the period 1978-2000. We also extend the method to account for the presence of equilibrium effects when performing counterfactual exercises.
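Mechanically, the correction amounts to replacing the target quantile level τ with an observation-specific level G(τ, p(z)) = C(τ, p(z))/p(z) implied by the copula, and then minimizing the rotated check function at those levels. The sketch below shows that adjustment step for a Gaussian copula, with the copula parameter and propensity scores taken as given (the paper estimates them, and the toy data here involve no actual selection), so it illustrates the mechanics rather than the full estimator.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal
from scipy.optimize import minimize

def adjusted_level(tau, p, rho):
    """G(tau, p) = C(tau, p; rho) / p for a Gaussian copula between outcome and selection ranks."""
    cov = [[1.0, rho], [rho, 1.0]]
    c = multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([norm.ppf(tau), norm.ppf(p)])
    return c / p

def rotated_check_fit(y, X, G):
    """Minimize the 'rotated' check function with observation-specific quantile levels G_i."""
    def loss(b):
        u = y - X @ b
        return np.sum(np.where(u >= 0, G * u, (G - 1.0) * u))
    return minimize(loss, x0=np.zeros(X.shape[1]), method="Nelder-Mead",
                    options={"xatol": 1e-6, "fatol": 1e-8, "maxiter": 20_000}).x

# Toy illustration with simulated participants and assumed propensities and copula parameter.
rng = np.random.default_rng(4)
n = 3000
x = rng.uniform(size=n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(size=n)                           # outcomes (no selection in this toy DGP)
p_hat = np.clip(rng.uniform(0.4, 0.9, size=n), 1e-3, 1 - 1e-3)   # stand-in propensity scores
rho, tau = 0.3, 0.5                                              # assumed copula parameter and quantile

G = np.array([adjusted_level(tau, p, rho) for p in p_hat])
beta = rotated_check_fit(y, X, G)
print("selection-adjusted quantile coefficients:", np.round(beta, 3))
```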

Journal ArticleDOI
TL;DR: In this article, the authors study from both a theoretical and an empirical perspective how a network of military alliances and enmities affects the intensity of a conflict and obtain a closed-form characterization of the Nash equilibrium.
Abstract: We study from both a theoretical and an empirical perspective how a network of military alliances and enmities affects the intensity of a conflict. The model combines elements from network theory and from the politico-economic theory of conflict. We obtain a closed-form characterization of the Nash equilibrium. Using the equilibrium conditions, we perform an empirical analysis using data on the Second Congo War, a conflict that involves many groups in a complex network of informal alliances and rivalries. The estimates of the fighting externalities are then used to infer the extent to which the conflict intensity can be reduced through (i) dismantling specific fighting groups involved in the conflict; (ii) weapon embargoes; (iii) interventions aimed at pacifying animosity among groups. Finally, with the aid of a random utility model, we study how policy shocks can induce a reshaping of the network structure.

Journal ArticleDOI
TL;DR: This work generalizes the Gale–Eisenberg theorem on competitive division to a mixed manna and derives results for the practically important case of linear preferences, where the comparison between the division of goods and that of bads is especially sharp.
Abstract: A mixed manna contains goods (that everyone likes) and bads (that everyone dislikes), as well as items that are goods to some agents, but bads or satiated to others. If all items are goods and utility functions are homogeneous of degree 1 and concave (and monotone), the competitive division maximizes the Nash product of utilities (Gale–Eisenberg): hence it is welfarist (determined by the set of feasible utility profiles), unique, continuous, and easy to compute. We show that the competitive division of a mixed manna is still welfarist. If the zero utility profile is Pareto dominated, the competitive profile is strictly positive and still uniquely maximizes the product of utilities. If the zero profile is unfeasible (for instance, if all items are bads), the competitive profiles are strictly negative and are the critical points of the product of disutilities on the efficiency frontier. The latter allows for multiple competitive utility profiles, from which no single-valued selection can be continuous or resource monotonic. Thus the implementation of competitive fairness under linear preferences in interactive platforms like SPLIDDIT will be more difficult when the manna contains bads that overwhelm the goods.
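For the goods-only benchmark invoked here (Gale–Eisenberg), the competitive division can be computed by maximizing the sum of log utilities — equivalently the Nash product — over feasible allocations. The sketch below does this for additive utilities with a made-up valuation matrix; the mixed-manna case analyzed in the paper requires the separate treatment described in the abstract.

```python
import numpy as np
from scipy.optimize import minimize

# Valuation matrix (agents x items) for additive utilities over divisible goods; illustrative numbers.
V = np.array([[6.0, 1.0, 3.0],
              [2.0, 5.0, 3.0],
              [1.0, 4.0, 5.0]])
n_agents, n_items = V.shape

def neg_log_nash(z):
    """Negative sum of log utilities; its minimizer is the competitive (Gale-Eisenberg) division."""
    alloc = z.reshape(n_agents, n_items)
    utilities = (V * alloc).sum(axis=1)
    return -np.sum(np.log(np.maximum(utilities, 1e-12)))

# Feasibility: each item is fully divided, i.e. every column of the allocation sums to one.
col_sum = np.kron(np.ones((1, n_agents)), np.eye(n_items))   # maps the flattened allocation to column sums
constraints = [{"type": "eq", "fun": lambda z: col_sum @ z - 1.0}]
z0 = np.full(n_agents * n_items, 1.0 / n_agents)

res = minimize(neg_log_nash, z0, method="SLSQP",
               bounds=[(0.0, 1.0)] * (n_agents * n_items), constraints=constraints)
alloc = res.x.reshape(n_agents, n_items)
print("competitive allocation (rows = agents, columns = items):\n", np.round(alloc, 3))
print("utility profile:", np.round((V * alloc).sum(axis=1), 3))
```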

ReportDOI
TL;DR: This paper uses tools from random set theory to study identification in generalized instrumental variable (GIV) models, provides a sharp characterization of the identified sets of structures these models admit, and demonstrates the analysis in a continuous-outcome model with an interval-censored endogenous explanatory variable.
Abstract: This paper develops characterizations of identified sets of structures and structural features for complete and incomplete models involving continuous or discrete variables. Multiple values of unobserved variables can be associated with particular combinations of observed variables. This can arise when there are multiple sources of heterogeneity, censored or discrete endogenous variables, or inequality restrictions on functions of observed and unobserved variables. The models generalize the class of incomplete instrumental variable (IV) models in which unobserved variables are single-valued functions of observed variables. Thus the models are referred to as generalized IV (GIV) models, but there are important cases in which instrumental variable restrictions play no significant role. Building on a definition of observational equivalence for incomplete models, the development uses results from random set theory that guarantee that the characterizations deliver sharp bounds, thereby dispensing with the need for case-by-case proofs of sharpness. The use of random sets defined on the space of unobserved variables allows identification analysis under mean and quantile independence restrictions on the distributions of unobserved variables conditional on exogenous variables as well as under a full independence restriction. The results are used to develop sharp bounds on the distribution of valuations in an incomplete model of English auctions, improving on the pointwise bounds available until now. Application of many of the results of the paper requires no familiarity with random set theory.

Journal ArticleDOI
TL;DR: In this article, the authors show that the way in which democratic transitions unfold is a key determinant of the extent of elite capture, and that slower transitions towards democracy allow the old-regime elites to capture democracy.
Abstract: Democracies widely differ in the extent to which powerful elites and interest groups retain influence over politics. While a large literature argues that elite capture is rooted in a country's history, our understanding of the determinants of elite persistence is limited. In this paper, we show that the way in which democratic transitions unfold is a key determinant of the extent of elite capture. We exploit quasi-random variation that originated during the Indonesian transition: Soeharto-regime mayors were allowed to finish their terms before being replaced by new leaders. Since mayors' political cycles were not synchronized, this event generated exogenous variation in how long old-regime mayors remained in their position during the democratic transition. Districts with longer exposure to old-regime mayors experience worse governance outcomes, higher elite persistence, and lower political competition in the medium-run. The results suggest that slower transitions towards democracy allow the old-regime elites to capture democracy.

Journal ArticleDOI
TL;DR: This paper develops easily-implemented empirical strategies that fully exploit the random assignment embedded in a wide class of mechanisms, while also revealing why seats are randomized at one school but not another, and shows how to use centralized assignment mechanisms to identify causal effects in models with multiple school sectors.
Abstract: A growing number of school districts use centralized assignment mechanisms to allocate school seats in a manner that reflects student preferences and school priorities. Many of these assignment schemes use lotteries to ration seats when schools are oversubscribed. The resulting random assignment opens the door to credible quasi-experimental research designs for the evaluation of school effectiveness. Yet the question of how best to separate the lottery-generated randomization integral to such designs from non-random preferences and priorities remains open. This paper develops easily-implemented empirical strategies that fully exploit the random assignment embedded in a wide class of mechanisms, while also revealing why seats are randomized at one school but not another. We use these methods to evaluate charter schools in Denver, one of a growing number of districts that combine charter and traditional public schools in a unified assignment system. The resulting estimates show large achievement gains from charter school attendance. Our approach generates efficiency gains over ad hoc methods, such as those that focus on schools ranked first, while also identifying a more representative average causal effect. We also show how to use centralized assignment mechanisms to identify causal effects in models with multiple school sectors.

Journal ArticleDOI
TL;DR: In this article, the authors provide the first theoretical analysis of altruism in networks and uncover four main features of this interdependence, including that there is a unique profile of incomes after transfers, for any network and any utility functions, and that there is no waste in transfers in equilibrium.
Abstract: We provide the first theoretical analysis of altruism in networks. Agents are embedded in a fixed, weighted network and care about their direct friends. Given some initial distribution of incomes, they may decide to support their poorer friends. We study the resulting non-cooperative transfer game. Our analysis highlights the importance of indirect gifts, where an agent gives to a friend because his friend himself has a friend in need. We uncover four main features of this interdependence. First, we show that there is a unique profile of incomes after transfers, for any network and any utility functions. Uniqueness in transfers holds on trees, but not on arbitrary networks. Second, there is no waste in transfers in equilibrium. In particular, transfers flow through indirect paths of highest altruistic strength. Third, a negative shock on one agent cannot benefit others and tends to affect socially closer agents first. In addition, an income redistribution that decreases inequality ex-ante can increase inequality ex-post. Fourth, altruistic networks decrease income inequality. In contrast, more altruistic or more homophilous networks can increase inequality.
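A hedged numerical illustration of the transfer game: under log utility (an assumption; the paper covers general concave utilities), equilibrium transfers can be approximated by iterating agents' best responses. The network, incomes, and altruism weights below are made up, and the iteration is an illustrative computation, not the paper's characterization.

```python
import numpy as np
from scipy.optimize import minimize

# Made-up line network 0 - 1 - 2 with altruism weight 0.5 on each link, and unequal incomes.
alpha = np.array([[0.0, 0.5, 0.0],
                  [0.5, 0.0, 0.5],
                  [0.0, 0.5, 0.0]])
y = np.array([10.0, 4.0, 1.0])
n = len(y)

def consumption(T):
    """Private consumption after transfers: income minus gifts made plus gifts received."""
    return y - T.sum(axis=1) + T.sum(axis=0)

def best_response(i, T):
    """Agent i chooses transfers to friends to maximize log(own c) + sum_j alpha_ij * log(c_j)."""
    friends = np.flatnonzero(alpha[i] > 0)
    def neg_utility(t):
        Ti = T.copy()
        Ti[i, :] = 0.0
        Ti[i, friends] = t
        c = np.maximum(consumption(Ti), 1e-9)
        return -(np.log(c[i]) + alpha[i] @ np.log(c))
    res = minimize(neg_utility, x0=T[i, friends],
                   bounds=[(0.0, y[i])] * len(friends), method="L-BFGS-B")
    Ti = T.copy()
    Ti[i, :] = 0.0
    Ti[i, friends] = res.x
    return Ti

T = np.zeros((n, n))
for sweep in range(200):
    T_old = T.copy()
    for i in range(n):
        T = best_response(i, T)
    if np.max(np.abs(T - T_old)) < 1e-6:
        break

print("equilibrium transfers:\n", np.round(T, 3))
print("post-transfer incomes:", np.round(consumption(T), 3))
```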

Journal ArticleDOI
TL;DR: In this article, a single-crossing random utility model (SCRUM) is proposed, in which the collection of preferences satisfies the single-crossing property, and a characterization of SCRUMs is given based on the classic Monotonicity property and a novel condition, Centrality.
Abstract: We propose a novel model of stochastic choice: the single-crossing random utility model (SCRUM). This is a random utility model in which the collection of preferences satisfies the single-crossing property. We offer a characterization of SCRUMs based on two easy-to-check properties: the classic Monotonicity property and a novel condition, Centrality. The identified collection of preferences and associated probabilities is unique. We show that SCRUMs nest both single-peaked and single-dipped random utility models and establish a stochastic monotone comparative statics result for the case of SCRUMs.

Journal ArticleDOI
TL;DR: In this article, the impact of private information in sealed-bid first-price auctions is explored: for a given symmetric and arbitrarily correlated prior distribution over values, the lowest winning-bid distribution that can arise across all information structures and equilibria is characterized, and the results provide lower bounds for bids and revenue under asymmetric prior distributions over values.
Abstract: We explore the impact of private information in sealed-bid first-price auctions. For a given symmetric and arbitrarily correlated prior distribution over values, we characterize the lowest winning-bid distribution that can arise across all information structures and equilibria. The information and equilibrium attaining this minimum leave bidders indifferent between their equilibrium bids and all higher bids. Our results provide lower bounds for bids and revenue with asymmetric distributions over values. We also report further characterizations of revenue and bidder surplus including upper bounds on revenue. Our work has implications for the identification of value distributions from data on winning bids and for the informationally robust comparison of alternative auction mechanisms.

Journal ArticleDOI
TL;DR: In this paper, the authors examine how sales force impacts competition and equilibrium prices in the context of a privatized pension market and find that exposure to sales force lowered price sensitivity, leading to inelastic demand and high equilibrium fees.
Abstract: This paper examines how sales force impacts competition and equilibrium prices in the context of a privatized pension market. We use detailed administrative data on fund manager choices and worker characteristics at the inception of Mexico's privatized social security system, where fund managers had to set prices (management fees) at the national level, but could select sales force levels by local geographic areas. We develop and estimate a model of fund manager choice where sales force can increase or decrease customer price sensitivity. We find exposure to sales force lowered price sensitivity, leading to inelastic demand and high equilibrium fees. We simulate oft-proposed policy solutions: a supply-side policy with a competitive government player and a demand-side policy that increases price elasticity. We find that demand-side policies are necessary to foster competition in social safety net markets with large segments of inelastic consumers.

Journal ArticleDOI
TL;DR: In this paper, the estimation of (joint) moments of microstructure noise based on high frequency data is conducted under a nonparametric setting, which allows the underlying price process to have jumps, the observation times to be irregularly spaced, and the noise to be dependent on the price process and to have diurnal features.
Abstract: We study the estimation of (joint) moments of microstructure noise based on high frequency data. The estimation is conducted under a nonparametric setting, which allows the underlying price process to have jumps, the observation times to be irregularly spaced, and the noise to be dependent on the price process and to have diurnal features. Estimators of arbitrary orders of (joint) moments are provided, for which we establish consistency as well as central limit theorems. In particular, we provide estimators of autocovariances and autocorrelations of the noise. Simulation studies demonstrate excellent performance of our estimators in the presence of jumps, irregular observation times, and even rounding. Empirical studies reveal (moderate) positive autocorrelations of microstructure noise for the stocks tested.
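In the textbook special case of i.i.d., mean-zero noise independent of the efficient price, the noise variance is identified by the first-order autocovariance of observed returns, E[ε²] = −Cov(r_t, r_{t+1}). The simulation below illustrates that simple moment estimator; the paper's estimators handle far more general settings (jumps, irregular observation times, price-dependent and diurnal noise) that this sketch does not attempt.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate one day of observed log prices: efficient random walk plus i.i.d. microstructure noise.
n = 23_400                      # one observation per second over 6.5 hours
sigma = 0.2 / np.sqrt(252)      # daily volatility of the efficient price (assumed)
noise_sd = 5e-4                 # standard deviation of the noise (assumed)
efficient = np.cumsum(sigma / np.sqrt(n) * rng.normal(size=n))
observed = efficient + noise_sd * rng.normal(size=n)

r = np.diff(observed)           # observed high-frequency returns
# First-order autocovariance of returns identifies the noise variance: Cov(r_t, r_{t+1}) = -E[eps^2].
autocov1 = np.mean(r[1:] * r[:-1])
noise_var_hat = -autocov1
print(f"true noise variance       {noise_sd**2:.3e}")
print(f"estimated noise variance  {noise_var_hat:.3e}")
```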

Journal ArticleDOI
TL;DR: In this paper, the authors present a scalable method for computing global solutions of high-dimensional stochastic dynamic models using an adaptive sparse grid algorithm, where grid points are added only where they are most needed.
Abstract: We present a flexible and scalable method for computing global solutions of high-dimensional stochastic dynamic models. Within a time iteration or value function iteration setup, we interpolate functions using an adaptive sparse grid algorithm. With increasing dimensions, sparse grids grow much more slowly than standard tensor product grids. Moreover, adaptivity adds a second layer of sparsity, as grid points are added only where they are most needed, for instance in regions with steep gradients or at non-differentiabilities. To further speed up the solution process, our implementation is fully hybrid parallel, combining distributed and shared memory parallelization paradigms, and thus permits an efficient use of high-performance computing architectures. To demonstrate the broad applicability of our method, we solve two very different types of dynamic models: first, high-dimensional international real business cycle models with capital adjustment costs and irreversible investment; second, multiproduct menu-cost models with temporary sales and economies of scope in price setting.
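The adaptivity layer described above — adding grid points only where the current interpolant is still inaccurate — can be illustrated in one dimension with hierarchical refinement: a candidate midpoint is kept only if its hierarchical surplus (the gap between the function value and the interpolant) exceeds a tolerance. The test function and tolerance below are arbitrary choices; actual applications use genuinely high-dimensional sparse grids as in the paper.

```python
import numpy as np

def f(x):
    """Test function with a kink, the kind of local feature adaptivity is meant to capture."""
    return np.abs(x - 0.3) + 0.1 * np.sin(8 * x)

def refine(tol=1e-3, max_level=12):
    """1-D adaptive hierarchical refinement: keep a new point only if its surplus exceeds tol."""
    xs = [0.0, 1.0]
    ys = [f(0.0), f(1.0)]
    active = [(0.0, 1.0)]                 # intervals that are candidates for refinement
    for level in range(max_level):
        next_active = []
        for a, b in active:
            m = 0.5 * (a + b)
            order = np.argsort(xs)
            interp = np.interp(m, np.array(xs)[order], np.array(ys)[order])
            surplus = f(m) - interp       # hierarchical surplus at the midpoint
            if abs(surplus) > tol:
                xs.append(m)
                ys.append(f(m))
                next_active += [(a, m), (m, b)]
        if not next_active:
            break
        active = next_active
    order = np.argsort(xs)
    return np.array(xs)[order], np.array(ys)[order]

x_grid, y_grid = refine()
x_test = np.linspace(0.0, 1.0, 2001)
err = np.max(np.abs(np.interp(x_test, x_grid, y_grid) - f(x_test)))
print(f"grid points used: {len(x_grid)}, max interpolation error: {err:.2e}")
```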