
Showing papers in "Econometrica in 2016"


Journal ArticleDOI
TL;DR: In this article, the authors examine how prices, markups and marginal costs respond to trade liberalization and find that firms offset their reductions in marginal costs by raising markups.
Abstract: This paper examines how prices, markups and marginal costs respond to trade liberalization. We develop a framework to estimate markups from production data with multi-product firms. This approach does not require assumptions on the market structure or demand curves faced by firms, nor assumptions on how firms allocate their inputs across products. We exploit quantity and price information to disentangle markups from quantity-based productivity, and then compute marginal costs by dividing observed prices by the estimated markups. We use India’s trade liberalization episode to examine how firms adjust these performance measures. Not surprisingly, we find that trade liberalization lowers factory-gate prices and that output tariff declines have the expected pro-competitive effects. However, the price declines are small relative to the declines in marginal costs, which fall predominantly because of the input tariff liberalization. The reason is that firms offset their reductions in marginal costs by raising markups. Our results demonstrate substantial heterogeneity and variability in markups across firms and time and suggest that producers benefited relative to consumers, at least immediately after the reforms. Long-term gains to consumers may be higher to the extent that higher firm profits lead to new product introductions and growth. Indeed, firms with larger increases in markups had a higher propensity to introduce new products during this period.
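
The last step of this pipeline is simple arithmetic, and a minimal sketch may help fix ideas. The snippet below assumes, in the spirit of the production approach described above, that markups are recovered as the ratio of a variable input's output elasticity to its revenue share; the function and variable names are illustrative, not the authors' code.

```python
import numpy as np

def markups_and_marginal_costs(prices, output_elasticity, input_expenditure, revenue):
    """Markup = output elasticity of a variable input / its revenue share;
    marginal cost = price / markup. All arguments are arrays over observations."""
    revenue_share = input_expenditure / revenue    # alpha: input spending as a share of sales
    markup = output_elasticity / revenue_share     # mu = theta / alpha
    marginal_cost = prices / markup                # mc = P / mu
    return markup, marginal_cost

# Toy example with two firm-product observations.
mu, mc = markups_and_marginal_costs(
    prices=np.array([10.0, 8.0]),
    output_elasticity=np.array([0.6, 0.6]),    # estimated elasticity of materials
    input_expenditure=np.array([30.0, 45.0]),  # materials expenditure
    revenue=np.array([100.0, 90.0]))
```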

492 citations


Journal ArticleDOI
TL;DR: In this article, the authors derive the large sample distribution of propensity score matching estimators, taking into account that the propensity score is itself estimated in a first step, prior to matching, and show that this first-step estimation affects the large sample distribution and variance of the matching estimators.
Abstract: Propensity score matching estimators (Rosenbaum and Rubin (1983)) are widely used in evaluation research to estimate average treatment effects. In this article, we derive the large sample distribution of propensity score matching estimators. Our derivations take into account that the propensity score is itself estimated in a first step, prior to matching. We prove that first step estimation of the propensity score affects the large sample distribution of propensity score matching estimators, and derive adjustments to the large sample variances of propensity score matching estimators of the average treatment effect (ATE) and the average treatment effect on the treated (ATET). The adjustment for the ATE estimator is negative (or zero in some special cases), implying that matching on the estimated propensity score is more efficient than matching on the true propensity score in large samples. However, for the ATET estimator, the sign of the adjustment term depends on the data generating process, and ignoring the estimation error in the propensity score may lead to confidence intervals that are either too large or too small.
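
To make the two-step structure concrete, here is a toy propensity score matching estimator of the ATE in Python. The logistic first step and one-nearest-neighbor matching with replacement are standard illustrative choices, and the variance adjustments derived in the paper are not implemented here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def psm_ate(X, D, Y):
    """X: covariates (n x k); D: 0/1 treatment indicator; Y: outcomes."""
    # Step 1: estimate the propensity score. This first-step estimation is
    # exactly what the paper shows must be reflected in the large sample variance.
    ps = LogisticRegression().fit(X, D).predict_proba(X)[:, 1]
    # Step 2: impute each unit's missing potential outcome from its nearest
    # neighbor (in propensity score) of the opposite treatment status.
    treated, control = np.where(D == 1)[0], np.where(D == 0)[0]
    impute = np.empty_like(Y, dtype=float)
    for i in range(len(Y)):
        pool = control if D[i] == 1 else treated
        j = pool[np.argmin(np.abs(ps[pool] - ps[i]))]
        impute[i] = Y[j]
    return np.where(D == 1, Y - impute, impute - Y).mean()
```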

427 citations


Journal ArticleDOI
TL;DR: The authors show that standard theories, which build on a random growth mechanism, generate transition dynamics that are too slow relative to those observed in the data and suggest two parsimonious deviations from the canonical model that can explain such changes: scale dependence that may arise from changes in skill prices and type dependence, that is, the presence of some high-growth types.
Abstract: The past forty years have seen a rapid rise in top income inequality in the United States. While there is a large number of existing theories of the Pareto tail of the long-run income distribution, almost none of these address the fast rise in top inequality observed in the data. We show that standard theories, which build on a random growth mechanism, generate transition dynamics that are too slow relative to those observed in the data. We then suggest two parsimonious deviations from the canonical model that can explain such changes: “scale dependence” that may arise from changes in skill prices, and “type dependence,” that is, the presence of some “high-growth types.” These deviations are consistent with theories in which the increase in top income inequality is driven by the rise of “superstar” entrepreneurs or managers.
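
For reference, the canonical random-growth benchmark the abstract alludes to can be stated compactly; the notation below is a common textbook rendering, not necessarily the paper's. If log income $x_{it}$ follows $dx_{it} = \mu\,dt + \sigma\,dZ_{it}$ with units reinjected (death and rebirth) at rate $\delta$, the stationary income distribution has a Pareto tail $\Pr(W > w) \propto w^{-\zeta}$, where $\zeta$ is the positive root of

```latex
\frac{\sigma^{2}}{2}\,\zeta^{2} + \mu\,\zeta - \delta = 0 .
```

The paper's observation is that after a change in these parameters, the tail converges to its new steady state very slowly, which is what motivates the scale-dependence and type-dependence deviations.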

297 citations


Journal ArticleDOI
TL;DR: In this article, the authors identify seven potential stabilizers in the data and include four theoretical channels through which they may operate in a business cycle model calibrated to the U.S. data.
Abstract: Most countries have automatic rules in their tax-and-transfer systems that are partly intended to stabilize economic fluctuations. This paper measures how effective they are at lowering the volatility of U.S. economic activity. We identify seven potential stabilizers in the data and include four theoretical channels through which they may operate in a business cycle model calibrated to the U.S. data. The model is used to compare the volatility of output in the data with counterfactuals where some, or all, of the stabilizers are shut down. Our first finding is that proportional taxes, like sales, property and corporate income taxes, contribute little to stabilization. Our second…

266 citations


Journal ArticleDOI
TL;DR: This article developed and estimated a general equilibrium search and matching model that accounts for key business cycle properties of macroeconomic aggregates, including labor market variables, and derived wage inertia from a specification of how firms and workers negotiate wages.
Abstract: We develop and estimate a general equilibrium search and matching model that accounts for key business cycle properties of macroeconomic aggregates, including labor market variables. In sharp contrast to leading New Keynesian models, we do not impose wage inertia. Instead we derive wage inertia from our specification of how firms and workers negotiate wages. Our model outperforms a variant of the standard New Keynesian Calvo sticky wage model. According to our estimated model, there is a critical interaction between the degree of price stickiness, monetary policy, and the duration of an increase in unemployment benefits.

254 citations


Journal ArticleDOI
TL;DR: In this article, a new attention allocation model is developed that uses the state of the business cycle to predict information choices, which in turn predict observable patterns of portfolio investments and returns.
Abstract: The question of whether and how mutual fund managers provide valuable services for their clients motivates one of the largest literatures in finance. One candidate explanation is that funds process information about future asset values and use that information to invest in high-valued assets. But formal theories are scarce because information choice models with many assets are difficult to solve as well as difficult to test. This paper tackles both problems by developing a new attention allocation model that uses the state of the business cycle to predict information choices, which, in turn, predict observable patterns of portfolio investments and returns. The predictions about fund portfolios' covariance with payoff shocks, cross-fund portfolio and return dispersion, and their excess returns are all supported by the data. These findings offer new evidence that some investment managers have skill and that attention is allocated rationally.

212 citations


Journal ArticleDOI
TL;DR: In this article, the authors estimate a dynamic model of employment, human capital accumulation, and savings for women in the United Kingdom, exploiting tax and benefit reforms, and use it to analyze the effects of welfare policy.
Abstract: We estimate a dynamic model of employment, human capital accumulation (including education), and savings for women in the United Kingdom, exploiting tax and benefit reforms, and use it to analyze the effects of welfare policy. We find substantial elasticities for labor supply, particularly for lone mothers. Returns to experience, which are important in determining the longer-term effects of policy, increase with education, but experience mainly accumulates when in full-time employment. Tax credits are welfare improving in the U.K., increase lone-mother labor supply and marginally reduce educational attainment, but the employment effects do not extend beyond the period of eligibility. Marginal increases in tax credits improve welfare more than equally costly increases in income support or tax cuts.

204 citations


Journal ArticleDOI
TL;DR: This article developed an econometric methodology to infer the path of risk premia from a large unbalanced panel of individual stock returns, using simple weighted two-pass cross-sectional regressions.
Abstract: We develop an econometric methodology to infer the path of risk premia from a large unbalanced panel of individual stock returns. We estimate the time-varying risk premia implied by conditional linear asset pricing models where the conditioning includes both instruments common to all assets and asset-specific instruments. The estimator uses simple weighted two-pass cross-sectional regressions, and we show its consistency and asymptotic normality under increasing cross-sectional and time series dimensions. We address consistent estimation of the asymptotic variance by hard thresholding, and we test for the asset pricing restrictions induced by the no-arbitrage assumption. We derive the restrictions given by a continuum of assets in a multi-period economy under an approximate factor structure robust to asset repackaging. The empirical analysis on returns for about ten thousand U.S. stocks from July 1964 to December 2009 shows that risk premia are large and volatile in crisis periods. They exhibit large positive and negative strays from time-invariant estimates, follow the macroeconomic cycles, and do not match risk premia estimates on standard sets of portfolios. The asset pricing restrictions are rejected for a conditional four-factor model capturing market, size, value, and momentum effects.
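
As a point of reference for the two-pass idea, here is a deliberately bare-bones version in Python: unweighted, balanced panel, constant betas. The paper's estimator adds the weighting, the conditioning instruments, unbalanced-panel handling, and the joint large-N, large-T asymptotics described above.

```python
import numpy as np

def two_pass_risk_premia(R, F):
    """R: T x N matrix of excess returns; F: T x K matrix of factor realizations."""
    T, N = R.shape
    X = np.column_stack([np.ones(T), F])
    # Pass 1: time-series regressions give each asset's factor loadings (betas).
    betas = np.linalg.lstsq(X, R, rcond=None)[0][1:, :].T          # N x K
    # Pass 2: period-by-period cross-sectional regressions of returns on betas.
    Z = np.column_stack([np.ones(N), betas])
    lambdas = np.array([np.linalg.lstsq(Z, R[t], rcond=None)[0][1:]
                        for t in range(T)])                        # T x K premia path
    return lambdas.mean(axis=0), lambdas
```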

178 citations


Journal ArticleDOI
TL;DR: The authors develop an equilibrium framework that relaxes the standard assumption that people have a correctly specified view of their environment and introduce the notion of a Berk–Nash equilibrium, in which each player follows a strategy that is optimal given her belief, and her belief is restricted to be the best fit among the set of beliefs she considers possible.
Abstract: We develop an equilibrium framework that relaxes the standard assumption that people have a correctly specified view of their environment. Each player is characterized by a (possibly misspecified) subjective model, which describes the set of feasible beliefs over payoff-relevant consequences as a function of actions. We introduce the notion of a Berk–Nash equilibrium: Each player follows a strategy that is optimal given her belief, and her belief is restricted to be the best fit among the set of beliefs she considers possible. The notion of best fit is formalized in terms of minimizing the Kullback–Leibler divergence, which is endogenous and depends on the equilibrium strategy profile. Standard solution concepts such as Nash equilibrium and self-confirming equilibrium constitute special cases where players have correctly specified models. We provide a learning foundation for Berk–Nash equilibrium by extending and combining results from the statistics literature on misspecified learning and the economics literature on learning in games.
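
The "best fit" requirement has a direct computational analogue: given the outcome distribution induced by an equilibrium action, the equilibrium belief minimizes Kullback–Leibler divergence within the player's subjective family. A toy sketch, with an invented outcome distribution and a deliberately misspecified binomial family:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

true_probs = np.array([0.1, 0.2, 0.4, 0.2, 0.1])  # true distribution over outcomes 0..4

def kl_to_model(theta):
    # KL divergence from the truth to a (misspecified) Binomial(4, theta) model.
    model_probs = binom.pmf(np.arange(5), 4, theta)
    return np.sum(true_probs * np.log(true_probs / model_probs))

best = minimize_scalar(kl_to_model, bounds=(0.01, 0.99), method="bounded")
# best.x is the belief a misspecified learner would settle on at this action.
```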

149 citations


Journal ArticleDOI
TL;DR: The authors found that higher quotas per capita were associated with a higher probability of revolution participation after the abolition and a higher incidence of uprisings in 1911 that marked the end of the 2,000 years of imperial rule.
Abstract: This paper studies how the abolition of an elite recruitment system—China's civil exam system that lasted over 1,300 years—affects political stability. Employing a panel data set across 262 prefectures and exploring the variations in the quotas on the entry-level exam candidates, we find that higher quotas per capita were associated with a higher probability of revolution participation after the abolition and a higher incidence of uprisings in 1911 that marked the end of the 2,000 years of imperial rule. This finding is robust to various checks including using the number of small rivers and short-run exam performance before the quota system as instruments. The patterns in the data appear most consistent with the interpretation that in regions with higher quotas per capita under the exam system, more would-be elites were negatively affected by the abolition. In addition, we document that modern human capital in the form of those studying in Japan also contributed to the revolution and that social capital strengthened the effect of quotas on revolution participation.

144 citations


Journal ArticleDOI
TL;DR: In this article, a new variant of Lasso called classifier-Lasso is proposed that shrinks individual coefficients to the unknown group-specific coefficients, achieving simultaneous classification and consistent estimation in a single step.
Abstract: This paper provides a novel mechanism for identifying and estimating latent group structures in panel data using penalized techniques. We consider both linear and nonlinear models where the regression coefficients are heterogeneous across groups but homogeneous within a group and the group membership is unknown. Two approaches are considered—penalized profile likelihood (PPL) estimation for the general nonlinear models without endogenous regressors, and penalized GMM (PGMM) estimation for linear models with endogeneity. In both cases, we develop a new variant of Lasso called classifier-Lasso (C-Lasso) that serves to shrink individual coefficients to the unknown group-specific coefficients. C-Lasso achieves simultaneous classification and consistent estimation in a single step and the classification exhibits the desirable property of uniform consistency. For PPL estimation, C-Lasso also achieves the oracle property so that group-specific parameter estimators are asymptotically equivalent to infeasible estimators that use individual group identity information. For PGMM estimation, the oracle property of C-Lasso is preserved in some special cases. Simulations demonstrate good finite-sample performance of the approach in both classification and estimation. Empirical applications to both linear and nonlinear models are presented.
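
For readers who want the shape of the penalty, the C-Lasso criterion can be sketched schematically as follows (notation illustrative; see the paper for the exact formulation). Each unit's fitting loss $Q_i(\beta_i)$ is augmented with a product of distances to the $K$ candidate group coefficients, which vanishes exactly when $\beta_i$ coincides with some group value $\alpha_k$:

```latex
\min_{\{\beta_i\},\,\{\alpha_k\}} \;
\frac{1}{N}\sum_{i=1}^{N} Q_i(\beta_i)
\;+\; \frac{\lambda}{N}\sum_{i=1}^{N}\;\prod_{k=1}^{K}\,\bigl\lVert \beta_i - \alpha_k \bigr\rVert .
```

This multiplicative form is what lets a single penalty classify units into groups and estimate the group coefficients at the same time.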

Journal ArticleDOI
TL;DR: This article developed a dynamic model of neighborhood choice along with a computationally light multi-step estimator to capture observed and unobserved preference heterogeneity across households and locations in a flexible way, using a newly assembled data set that matched demographic information from mortgage applications to the universe of housing transactions in the San Francisco Bay Area from 1994 to 2004.
Abstract: This paper develops a dynamic model of neighborhood choice along with a computationally light multi-step estimator. The proposed empirical framework captures observed and unobserved preference heterogeneity across households and locations in a flexible way. We estimate the model using a newly assembled data set that matches demographic information from mortgage applications to the universe of housing transactions in the San Francisco Bay Area from 1994 to 2004. The results provide the first estimates of the marginal willingness to pay for several non-marketed amenities—neighborhood air pollution, violent crime, and racial composition—in a dynamic framework. Comparing these estimates with those from a static version of the model highlights several important biases that arise when dynamic considerations are ignored.

Journal ArticleDOI
TL;DR: In this paper, a theory of monetary policy and macro-prudential interventions in financial markets is proposed, and a simple formula for the required financial interventions that depends on a small number of measurable sufficient statistics is provided.
Abstract: We propose a theory of monetary policy and macroprudential interventions in financial markets. We focus on economies with nominal rigidities in goods and labor markets and subject to constraints on monetary policy, such as the zero lower bound or fixed exchange rates. We identify an aggregate demand externality that can be corrected by macroprudential interventions in financial markets. Ex post, the distribution of wealth across agents affects aggregate demand and output. Ex ante, however, these effects are not internalized in private financial decisions. We provide a simple formula for the required financial interventions that depends on a small number of measurable sufficient statistics. We also characterize optimal monetary policy. We extend our framework to incorporate pecuniary externalities, providing a unified approach to both externalities. Finally, we provide a number of applications which illustrate the relevance of our theory. [web URL: http://onlinelibrary.wiley.com/doi/10.3982/ECTA11883/abstract]

Journal ArticleDOI
TL;DR: In this paper, a continuous-time contracting problem under hidden action is studied, where the principal has ambiguous beliefs about the project cash flows and designs a robust contract that maximizes his utility under the worst-case scenario subject to the agent's incentive and participation constraints.
Abstract: We study a continuous-time contracting problem under hidden action, where the principal has ambiguous beliefs about the project cash flows. The principal designs a robust contract that maximizes his utility under the worst-case scenario subject to the agent’s incentive and participation constraints. Robustness generates endogenous belief heterogeneity and induces a tradeoff between incentives and ambiguity sharing so that the incentive constraint does not always bind. We implement the optimal contract by cash reserves, debt, and equity. In addition to receiving ordinary dividends when cash reserves reach a threshold, outside equity holders also receive special dividends or inject cash in the cash reserves to hedge against model uncertainty and smooth dividends. Ambiguity aversion raises both the equity premium and the credit yield spread. The equity premium and the credit yield spread are state dependent and high for distressed firms with low cash reserves.

Journal ArticleDOI
TL;DR: In this article, the authors extend Kyle's (1985) model of insider trading to the case where liquidity provided by noise traders follows a general stochastic process, and show that insiders optimally wait to trade more aggressively when noise trading activity is higher, while market makers anticipate this and adjust prices accordingly.
Abstract: We extend Kyle’s (1985) model of insider trading to the case where liquidity provided by noise traders follows a general stochastic process. Even though the level of noise trading volatility is observable, in equilibrium, measured price impact is stochastic. If noise trading volatility is mean-reverting, then the equilibrium price follows a multivariate ‘stochastic bridge’ process, which displays stochastic volatility. This is because insiders choose to optimally wait to trade more aggressively when noise trading activity is higher. In equilibrium, market makers anticipate this and adjust prices accordingly. More private information is revealed when volatility is higher. In time series, insiders trade more aggressively when measured price impact is lower. Therefore, execution costs to uninformed traders can be higher when price impact is lower.
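
For orientation, the familiar single-period Kyle benchmark that this paper generalizes: with fundamental value $v \sim N(p_0, \sigma_v^2)$ and noise order flow $u \sim N(0, \sigma_u^2)$, the linear equilibrium has insider demand $x = \beta(v - p_0)$ and pricing rule $p = p_0 + \lambda(x + u)$, with

```latex
\lambda = \frac{1}{2}\,\frac{\sigma_v}{\sigma_u}, \qquad \beta = \frac{\sigma_u}{\sigma_v} .
```

The abstract's point is that once $\sigma_u$ varies stochastically over time, the measured price impact $\lambda$ itself becomes stochastic even though the level of noise trading volatility is observable.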

ReportDOI
TL;DR: In this article, the authors consider general heterogeneous demand where preferences and linear budget sets are statistically independent, and they find that the dimension of heterogeneity and the individual demand functions are not identified.
Abstract: Individual heterogeneity is an important source of variation in demand. Allowing for general heterogeneity is needed for correct welfare comparisons. We consider general heterogeneous demand where preferences and linear budget sets are statistically independent. We find that the dimension of heterogeneity and the individual demand functions are not identified. We also find that the exact consumer surplus of a price change, averaged across individuals, is not identified, motivating bounds analysis. We use bounds on income effects to derive relatively simple bounds on the average surplus, including for discrete/continuous choice. We also sketch an approach to bounding surplus that does not use income effect bounds. We apply the results with income effect bounds to gasoline demand. We find little sensitivity to the income effect bounds in this application.

Journal ArticleDOI
TL;DR: The authors analyzes a sequential search model with adverse selection and identifies circumstances under which prices fail to aggregate information well even when search frictions are small, and traces this to a strong form of the winner's curse.
Abstract: This paper analyzes a sequential search model with adverse selection. We study information aggregation by the price—how close the equilibrium prices are to the full-information prices—when search frictions are small. We identify circumstances under which prices fail to aggregate information well even when search frictions are small. We trace this to a strong form of the winner's curse that is present in the sequential search model. The failure of information aggregation may result in inefficient allocations.

Journal ArticleDOI
TL;DR: In this paper, market microstructure invariance is defined as the hypotheses that the distributions of risk transfers (bets) and transaction costs are constant across assets when measured per unit of business time.
Abstract: Using the intuition that financial markets transfer risks in business time, “market microstructure invariance” is defined as the hypotheses that the distributions of risk transfers (“bets”) and transaction costs are constant across assets when measured per unit of business time. The invariance hypotheses imply that bet size and transaction costs have specific, empirically testable relationships to observable dollar volume and volatility. Portfolio transitions can be viewed as natural experiments for measuring transaction costs, and individual orders can be treated as proxies for bets. Empirical tests based on a data set of 400,000+ portfolio transition orders support the invariance hypotheses. The constants calibrated from structural estimation imply specific predictions for the arrival rate of bets (“market velocity”), the distribution of bet sizes, and transaction costs.
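
A compact (and hedged) way to state the headline scaling implications, using common shorthand rather than the paper's own notation: define trading activity $W = \sigma \cdot P \cdot V$, the product of return volatility, price, and share volume. Invariance then implies that the arrival rate of bets $\gamma$ and the size of a typical bet $\tilde{Q}$ relative to volume scale as

```latex
\gamma \;\propto\; W^{2/3}, \qquad \frac{|\tilde{Q}|}{V} \;\propto\; W^{-2/3} .
```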

Journal ArticleDOI
TL;DR: In this paper, an endogenous growth model is developed in which each period firms invest in researching and developing new ideas; an idea increases a firm's productivity by an amount that depends on the technological propinquity between the idea and the firm's line of business.
Abstract: An endogenous growth model is developed where each period firms invest in researching and developing new ideas. An idea increases a firm's productivity. By how much depends on the technological propinquity between an idea and the firm's line of business. Ideas can be bought and sold on a market for patents. A firm can sell an idea that is not relevant to its business or buy one if it fails to innovate. The developed model is matched up with stylized facts about the market for patents in the United States. The analysis gauges how efficiency in the patent market affects growth.

Journal ArticleDOI
TL;DR: In this paper, the authors show that if credit is easy, money is useless, while if credit is tight, money can be essential but credit is irrelevant: changes in debt limits are neutral, because real balances respond endogenously to leave total liquidity constant.
Abstract: Do we need both money and credit? In models with explicit roles for payment instruments, we show the answer is no. If credit is easy, money is useless; if credit is tight, money can be essential, but then credit is irrelevant, and changes in debt limits are neutral — real balances respond endogenously to leave total liquidity constant. This is true for exogenous or endogenous policy and debt limits, secured or unsecured credit, and fairly general preferences and pricing mechanisms. While we also show how to overturn some results, the benchmark model suggests credit conditions matter less than some people think.

Journal ArticleDOI
TL;DR: In this paper, the authors estimate demand for residential broadband using high-frequency data from subscribers facing a three-part tariff and find that usage-based pricing eliminates low-value traffic.
Abstract: We estimate demand for residential broadband using high-frequency data from subscribers facing a three-part tariff. The three-part tariff makes data usage during the billing cycle a dynamic problem, thus generating variation in the (shadow) price of usage. We provide evidence that subscribers respond to this variation, and we use their dynamic decisions to estimate a flexible distribution of willingness to pay for different plan characteristics. Using the estimates, we simulate demand under alternative pricing and find that usage-based pricing eliminates low-value traffic. Furthermore, we show that the costs associated with investment in fiber-optic networks are likely recoverable in some markets, but that there is a large gap between social and private incentives to invest.
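
The dynamic mechanism is easy to see in a toy finite-horizon program; the allowance, overage price, and utility function below are all invented for illustration, not the paper's estimated model.

```python
import numpy as np

T, C = 30, 60                  # days in the billing cycle, allowance in GB
p_over = 3.0                   # overage price per GB (the third part of the tariff)
u = lambda q: 2.0 * np.sqrt(q + 1e-9)   # daily utility of q GB of usage

V = np.zeros(C + 1)            # value over remaining allowance; nothing after the cycle
for t in range(T):             # backward induction over days
    V_new = np.empty_like(V)
    for c in range(C + 1):
        best = -np.inf
        for q in range(0, 11):            # daily usage choice: 0..10 GB
            over = max(q - c, 0)          # GB billed at the overage rate
            best = max(best, u(q) - p_over * over + V[max(c - q, 0)])
        V_new[c] = best
    V = V_new
# V[c] - V[c - 1] approximates the shadow price of a GB of allowance: it rises
# as the allowance runs low, which is the usage variation the paper exploits.
```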

Journal ArticleDOI
TL;DR: In this article, the role of stochastic feasibility in consumer choice using a random conditional choice set rule (RCCSR) was examined, and the authors characterized the model from conditions on the availability of alternatives.
Abstract: We examine the role of stochastic feasibility in consumer choice using a random conditional choice set rule (RCCSR) and uniquely characterize the model from conditions on stochastic choice data. Feasibility is modeled to permit correlation in availability of alternatives. This provides a natural way to examine substitutability/complementarity. We show that an RCCSR generalizes the random consideration set rule of Manzini and Mariotti (2014). We then relate this model to existing literature. In particular, an RCCSR is not a random utility model.

Journal ArticleDOI
TL;DR: In this paper, equilibrium quit turnover in a frictional labor market with costly hiring by firms, where large firms employ many workers and face both aggregate and firm-specific productivity shocks, is considered.
Abstract: This paper considers equilibrium quit turnover in a frictional labor market with costly hiring by firms, where large firms employ many workers and face both aggregate and firm-specific productivity shocks. There is exogenous firm turnover as new (small) startups enter the market over time, while some existing firms fail and exit. Individual firm growth rates are disperse and evolve stochastically. The paper highlights how dynamic monopsony, where firms trade off lower wages against higher (endogenous) employee quit rates, yields excessive job-to-job quits. Such quits directly crowd out the reemployment prospects of the unemployed. With finite firm productivity states, stochastic equilibrium is fully tractable and can be computed using standard numerical techniques.

Journal ArticleDOI
TL;DR: In this article, the authors address the question of consistency and asymptotic distributions of IV estimates of demand in a small number of markets as the number of products increases in some commonly used demand models under conditions on economic primitives.
Abstract: IO economists often estimate demand for differentiated products using data sets with a small number of large markets. By modeling demand as depending on a small number of product characteristics, one might hope to obtain increasingly precise estimates of demand parameters as the number of products in a single market grows large. In this paper, I address the question of consistency and asymptotic distributions of IV estimates of demand in a small number of markets as the number of products increases in some commonly used demand models under conditions on economic primitives. I show that, under the common assumption of a Bertrand-Nash equilibrium in prices, product characteristics lose their identifying power as price instruments in the limit in many of these models, giving inconsistent estimates in these cases. I find that consistent estimates can still be obtained for many of the cases I consider, but care must be taken in modeling demand and choosing instruments. For cases where consistent estimates can be obtained, I provide sufficient conditions for consistency and asymptotic normality of estimates of parameters and counterfactual outcomes. A Monte Carlo study confirms that the asymptotic results provide an accurate description…

Journal ArticleDOI
TL;DR: In this paper, competitive equilibria of economies where assets are heterogeneous and traders have heterogeneous information about them are studied. And the model can be applied to find conditions under which these economies feature fire sales, contagion, and flights to quality.
Abstract: This paper studies competitive equilibria of economies where assets are heterogeneous and traders have heterogeneous information about them. Markets are defined by a price and a procedure for clearing trades, and any asset can, in principle, be traded in any market. Buyers can use their information to impose acceptance rules which specify which assets they are willing to trade in each market. The set of markets where trade takes place is derived endogenously. The model can be applied to find conditions under which these economies feature fire sales, contagion, and flights to quality.

Journal ArticleDOI
TL;DR: In the context of probabilistic social choice, this article shows that consistency with respect to a variable electorate and consistency with respect to components of similar alternatives uniquely characterize a function proposed by Fishburn (1984), which returns maximal lotteries that correspond to optimal mixed strategies in the symmetric zero-sum game induced by the pairwise majority margins.
Abstract: Two fundamental axioms in social choice theory are consistency with respect to a variable electorate and consistency with respect to components of similar alternatives. In the context of traditional non-probabilistic social choice, these axioms are incompatible with each other. We show that in the context of probabilistic social choice, these axioms uniquely characterize a function proposed by Fishburn (1984). Fishburn's function returns so-called maximal lotteries, that is, lotteries that correspond to optimal mixed strategies in the symmetric zero-sum game induced by the pairwise majority margins. Maximal lotteries are guaranteed to exist due to von Neumann's Minimax Theorem, are almost always unique, and can be efficiently computed using linear programming. [web URL: http://onlinelibrary.wiley.com/doi/10.3982/ECTA13337/abstract]
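
Since the abstract notes that maximal lotteries are computable by linear programming, here is a small self-contained sketch with SciPy. The majority-margin matrix is a made-up three-alternative cycle; a maximal lottery is a mixed strategy $p$ with $p \ge 0$, $\sum_i p_i = 1$, and $p^{\top}M \ge 0$ componentwise.

```python
import numpy as np
from scipy.optimize import linprog

# Pairwise majority margins among alternatives a, b, c (skew-symmetric):
# a beats b by 2, b beats c by 6, c beats a by 4.
M = np.array([[ 0.,  2., -4.],
              [-2.,  0.,  6.],
              [ 4., -6.,  0.]])

n = M.shape[0]
# Feasibility LP: minimize 0 subject to M'p >= 0, 1'p = 1, p >= 0.
res = linprog(c=np.zeros(n),
              A_ub=-M.T, b_ub=np.zeros(n),
              A_eq=np.ones((1, n)), b_eq=np.array([1.0]),
              bounds=[(0, None)] * n, method="highs")
p = res.x   # here the unique maximal lottery: (1/2, 1/3, 1/6)
```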

ReportDOI
TL;DR: In this article, a framework for identifying preferences in a large network under the assumption of pairwise stability of network links is presented, where the observed proportions of various possible payoff-relevant local network structures are used to learn about the underlying parameters.
Abstract: This paper provides a framework for identifying preferences in a large network under the assumption of pairwise stability of network links. Network data present difficulties for identification, especially when links between nodes in a network can be interdependent: e.g., where indirect connections matter. Given a preference specification, we use the observed proportions of various possible payoff-relevant local network structures to learn about the underlying parameters. We show how one can map the observed proportions of these local structures to sets of parameters that are consistent with the model and the data. Our main result provides necessary conditions for parameters to belong to the identified set, and this result holds for a wide class of models. We also provide sufficient conditions - and hence a characterization of the identified set - for two empirically relevant classes of specifications. An interesting feature of our approach is the use of the economic model under pairwise stability as a vehicle for effective dimension reduction. The paper then provides a quadratic programming algorithm that can be used to construct the identified sets. This algorithm is illustrated with a pair of simulation exercises.

Journal ArticleDOI
TL;DR: In this paper, the authors test for the existence of housing bubbles associated with a failure of the transversality condition that requires the present value of payments occurring infinitely far in the future to be zero.
Abstract: We test for the existence of housing bubbles associated with a failure of the transversality condition that requires the present value of payments occurring infinitely far in the future to be zero. The most prominent such bubble is the classic rational bubble. We study housing markets in the United Kingdom and Singapore, where residential property ownership takes the form of either leaseholds or freeholds. Leaseholds are finite-maturity, pre-paid, and tradeable ownership contracts with maturities often exceeding 700 years. Freeholds are infinite-maturity ownership contracts. The price difference between leaseholds with extremely long maturities and freeholds reflects the present value of a claim to the freehold after leasehold expiry, and is thus a direct empirical measure of the transversality condition. We estimate this price difference, and find no evidence of failures of the transversality condition in housing markets in the U.K. and Singapore, even during periods when a sizable bubble was regularly thought to be present.
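
In schematic form (notation mine, not the paper's), the identification idea is that the freehold-minus-leasehold price difference is the present value of the claim after leasehold expiry, which the transversality condition forces to vanish at long maturities:

```latex
P^{\text{freehold}}_{t} - P^{\text{lease}}_{t,T}
\;=\; \mathbb{E}_{t}\!\left[\, M_{t,t+T}\, P^{\text{freehold}}_{t+T} \right]
\;\longrightarrow\; 0 \quad \text{as } T \to \infty ,
```

where $M_{t,t+T}$ is the stochastic discount factor. A rational bubble would keep this spread bounded away from zero even at very long maturities, which is what the estimates find no evidence of.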

Journal ArticleDOI
TL;DR: In this article, the authors used multiple waves of longitudinal survey data from Central Java, Indonesia, to test a key prediction of the recursive model: demand for farm labor is unrelated to the demographic composition of the farm household.
Abstract: The farm household model has played a central role in improving the understanding of small-scale agricultural households and non-farm enterprises. Under the assumptions that all current and future markets exist and that farmers treat all prices as given, the model simplifies households' simultaneous production and consumption decisions into a recursive form in which production can be treated as independent of preferences of household members. These assumptions, which are the foundation of a large literature in labor and development, have been tested and not rejected in several important studies including Benjamin (1992). Using multiple waves of longitudinal survey data from Central Java, Indonesia, this paper tests a key prediction of the recursive model: demand for farm labor is unrelated to the demographic composition of the farm household. The prediction is unambiguously rejected. The rejection cannot be explained by contamination due to unobserved heterogeneity that is fixed at the farm level, local area shocks or farm-specific shocks that affect changes in household composition and farm labor demand. We conclude that the recursive form of the farm household model is not consistent with the data. Developing empirically tractable models of farm households when markets are incomplete remains an important challenge.
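
The test itself is a simple regression exercise. Below is a schematic version with synthetic data (all names and magnitudes invented): difference the panel to sweep out farm fixed effects, then ask whether changes in household composition predict changes in farm labor demand, which the recursive model says they should not.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_farms, n_waves = 500, 3
demog = rng.poisson(3, (n_farms, n_waves)).astype(float)  # e.g. number of adult members
labor = 5 + rng.normal(0, 1, (n_farms, n_waves))          # farm labor demand

d_demog = np.diff(demog, axis=1).ravel()   # first-differencing removes farm fixed effects
d_labor = np.diff(labor, axis=1).ravel()
groups = np.repeat(np.arange(n_farms), n_waves - 1)

res = sm.OLS(d_labor, sm.add_constant(d_demog)).fit(
    cov_type="cluster", cov_kwds={"groups": groups})
# Recursion predicts a zero coefficient on d_demog; in the paper's Indonesian
# data the analogous restriction is unambiguously rejected.
print(res.summary())
```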

Journal ArticleDOI
TL;DR: In this article, the authors introduce the class of conditional linear combination tests, which reject null hypotheses concerning model parameters when a data-dependent convex combination of two identification-robust statistics is large.
Abstract: We introduce the class of conditional linear combination tests, which reject null hypotheses concerning model parameters when a data-dependent convex combination of two identification-robust statistics is large. These tests control size under weak identification and have a number of optimality properties in a conditional problem. We show that the conditional likelihood ratio test of Moreira, 2003 is a conditional linear combination test in models with one endogenous regressor, and that the class of conditional linear combination tests is equivalent to a class of quasi-conditional likelihood ratio tests. We suggest using minimax regret conditional linear combination tests and propose a computationally tractable class of tests that plug in an estimator for a nuisance parameter. These plug-in tests perform well in simulation and have optimal power in many strongly identified models, thus allowing powerful identification-robust inference in a wide range of linear and nonlinear models without sacrificing efficiency if identification is strong.