
Showing papers in "Econometrica in 2009"


ReportDOI
TL;DR: In this article, a model with a time-varying second moment is proposed to simulate a macro uncertainty shock, which produces a rapid drop and rebound in aggregate output and employment.
Abstract: Uncertainty appears to jump up after major shocks like the Cuban Missile crisis, the assassination of JFK, the OPEC I oil-price shock, and the 9/11 terrorist attacks. This paper offers a structural framework to analyze the impact of these uncertainty shocks. I build a model with a time-varying second moment, which is numerically solved and estimated using firm-level data. The parameterized model is then used to simulate a macro uncertainty shock, which produces a rapid drop and rebound in aggregate output and employment. This occurs because higher uncertainty causes firms to temporarily pause their investment and hiring. Productivity growth also falls because this pause in activity freezes reallocation across units. In the medium term the increased volatility from the shock induces an overshoot in output, employment, and productivity. Thus, uncertainty shocks generate short sharp recessions and recoveries. This simulated impact of an uncertainty shock is compared to vector autoregression estimations on actual data, showing a good match in both magnitude and timing. The paper also jointly estimates labor and capital adjustment costs (both convex and nonconvex). Ignoring capital adjustment costs is shown to lead to substantial bias, while ignoring labor adjustment costs does not.

2,256 citations


Journal ArticleDOI
TL;DR: In this article, the authors consider large N and large T panel data models with unobservable multiple interactive effects and derive the rate of convergence and the limiting distribution of the interactive-effects estimator of the common slope coefficients.
Abstract: This paper considers large N and large T panel data models with unobservable multiple interactive effects. These models are useful for both micro and macro econometric modeling. In earnings studies, for example, workers’ motivation, persistence, and diligence combine to influence earnings in addition to the usual factor of innate ability. In macroeconomics, the interactive effects represent unobservable common shocks and their heterogeneous responses over cross sections. Since the interactive effects are allowed to be correlated with the regressors, they are treated as fixed-effects parameters to be estimated along with the common slope coefficients. The model is estimated by the least squares method, which provides the interactive-effects counterpart of the within estimator. We first consider model identification, and then derive the rate of convergence and the limiting distribution of the interactive-effects estimator of the common slope coefficients. The estimator is shown to be √NT consistent. This rate is valid even in the presence of correlations and heteroskedasticities in both dimensions, a striking contrast with the fixed-T framework in which serial correlation and heteroskedasticity imply nonidentification. The asymptotic distribution is not necessarily centered at zero. Bias-corrected estimators are derived. We also derive the constrained estimator and its limiting distribution, imposing additivity coupled with interactive effects. The problem of testing additive versus interactive effects is also studied. We also derive identification conditions for models with grand mean, time-invariant regressors, and common regressors. It is shown that there exists a set of necessary and sufficient identification conditions for those models. Given identification, the rate of convergence and limiting results continue to hold.

1,219 citations
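The least-squares estimator in the abstract above can be computed by iterating between two familiar steps: pooled OLS of the defactored data on the regressors, and principal components of the residuals to update the factors and loadings. A minimal numpy sketch of that iteration, assuming a balanced panel and a known number of factors r (the function name and the simulation design in the usage note are illustrative, not from the paper):

```python
import numpy as np

def interactive_effects_ols(Y, X, r, n_iter=200, tol=1e-8):
    """Iterated least squares for Y_it = X_it' beta + lambda_i' F_t + e_it.

    Y: (T, N) outcomes; X: (T, N, p) regressors; r: number of factors.
    Alternates between (i) pooled OLS of Y minus the common component on X
    and (ii) principal components of the residuals to update F and Lambda.
    """
    T, N = Y.shape
    p = X.shape[2]
    Xmat = X.reshape(T * N, p)
    # initialize with pooled OLS, ignoring the factor structure
    beta = np.linalg.lstsq(Xmat, Y.reshape(-1), rcond=None)[0]
    for _ in range(n_iter):
        W = Y - X @ beta                          # (T, N) residual matrix
        # principal components: F = sqrt(T) * top-r eigenvectors of W W'
        eigval, eigvec = np.linalg.eigh(W @ W.T)
        F = np.sqrt(T) * eigvec[:, -r:]           # (T, r), normalized F'F/T = I
        Lam = W.T @ F / T                         # (N, r) loadings
        common = F @ Lam.T                        # estimated common component
        beta_new = np.linalg.lstsq(Xmat, (Y - common).reshape(-1), rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta, F, Lam
```

On simulated panels with, say, T = N = 60 and one factor, the iteration recovers the slope coefficients closely even though the factors and loadings are treated as unknown parameters.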


Journal ArticleDOI
TL;DR: The authors develop a model of friendship formation that sheds light on segregation patterns observed in social and economic networks. Individuals have types and see type-dependent benefits from friendships, and the authors examine the properties of a steady-state equilibrium of a matching process of friendship formation.
Abstract: We develop a model of friendship formation that sheds light on segregation patterns observed in social and economic networks. Individuals have types and see type-dependent benefits from friendships. We examine the properties of a steady-state equilibrium of a matching process of friendship formation. We use the model to understand three empirical patterns of friendship formation: (i) larger groups tend to form more same-type ties and fewer other-type ties than small groups, (ii) larger groups form more ties per capita, and (iii) all groups are biased towards same-type relative to demographics, with the most extreme bias coming from middle-sized groups. We show how these empirical observations can be generated by biases in preferences and biases in meetings. We also illustrate some welfare implications of the model.

853 citations


Journal ArticleDOI
TL;DR: In this article, the authors consider an alternative explanation that adds the hypothesis that people like to be perceived as fair; the resulting theory has additional testable implications, whose validity they confirm through new experiments.
Abstract: A norm of 50-50 division appears to have considerable force in a wide range of economic environments, both in the real world and in the laboratory. Even in settings where one party unilaterally determines the allocation of a prize (the dictator game), many subjects voluntarily cede exactly half to another individual. The hypothesis that people care about fairness does not by itself account for key experimental patterns. We consider an alternative explanation, which adds the hypothesis that people like to be perceived as fair. The properties of equilibria for the resulting signaling game correspond closely to laboratory observations. The theory has additional testable implications, the validity of which we confirm through new experiments.

733 citations


Journal ArticleDOI
TL;DR: The authors use a controlled experiment to explore whether there are gender differences in selecting into competitive environments across two distinct societies: the Maasai in Tanzania and the Khasi in India.
Abstract: We use a controlled experiment to explore whether there are gender differences in selecting into competitive environments across two distinct societies: the Maasai in Tanzania and the Khasi in India. One unique aspect of these societies is that the Maasai represent a textbook example of a patriarchal society, whereas the Khasi are matrilineal. Similar to the extant evidence drawn from experiments executed in Western cultures, Maasai men opt to compete at roughly twice the rate of Maasai women. Interestingly, this result is reversed among the Khasi, where women choose the competitive environment more often than Khasi men, and even choose to compete weakly more often than Maasai men. These results provide insights into the underpinnings of the factors hypothesized to be determinants of the observed gender differences in selecting into competitive environments.

719 citations


Journal ArticleDOI
TL;DR: In this article, the authors discuss the failure of the canonical search and matching model to match the cyclical volatility in the job finding rate and show that job creation in the model is influenced by wages in new matches.
Abstract: I discuss the failure of the canonical search and matching model to match the cyclical volatility in the job finding rate. I show that job creation in the model is influenced by wages in new matches. I summarize microeconometric evidence and find that wages in new matches are volatile and consistent with the model's key predictions. Therefore, explanations of the unemployment volatility puzzle have to preserve the cyclical volatility of wages. I discuss a modification of the model, based on fixed matching costs, that can increase cyclical unemployment volatility and is consistent with wage flexibility in new matches.

612 citations


ReportDOI
TL;DR: In this article, the authors develop modeling and inference tools for counterfactual distributions based on regression methods, which can be used to test functional hypotheses such as no-effect, positive effect, or stochastic dominance.
Abstract: Counterfactual distributions are important ingredients for policy analysis and decomposition analysis in empirical economics. In this article, we develop modeling and inference tools for counterfactual distributions based on regression methods. The counterfactual scenarios that we consider consist of ceteris paribus changes in either the distribution of covariates related to the outcome of interest or the conditional distribution of the outcome given covariates. For either of these scenarios, we derive joint functional central limit theorems and bootstrap validity results for regression-based estimators of the status quo and counterfactual outcome distributions. These results allow us to construct simultaneous confidence sets for function-valued effects of the counterfactual changes, including the effects on the entire distribution and quantile functions of the outcome as well as on related functionals. These confidence sets can be used to test functional hypotheses such as no-effect, positive effect, or stochastic dominance. Our theory applies to general counterfactual changes and covers the main regression methods including classical, quantile, duration, and distribution regressions. We illustrate the results with an empirical application to wage decompositions using data for the United States. As a part of developing the main results, we introduce distribution regression as a comprehensive and flexible tool for modeling and estimating the entire conditional distribution. We show that distribution regression encompasses the Cox duration regression and represents a useful alternative to quantile regression. We establish functional central limit theorems and bootstrap validity results for the empirical distribution regression process and various related functionals.

547 citations
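The distribution regression idea in the abstract above — model the conditional distribution F(y|x) by running a separate binary regression of 1{Y ≤ y} on X at each threshold y — is easy to sketch. A minimal version with a logit link in plain numpy (the Newton solver and the simulated example are illustrative; the paper's theory covers a general class of link functions and functionals):

```python
import numpy as np

def logit_fit(X, y, n_iter=50):
    """Logistic regression fit by Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1 - p) + 1e-10          # weights, jittered for stability
        grad = X.T @ (y - p)
        H = (X * W[:, None]).T @ X       # negative Hessian of the log-likelihood
        beta = beta + np.linalg.solve(H, grad)
    return beta

def distribution_regression(Y, X, grid):
    """Estimate F(y|x): one logit of the indicator 1{Y <= y} on X per threshold y."""
    Xc = np.column_stack([np.ones(len(Y)), X])
    return np.array([logit_fit(Xc, (Y <= y).astype(float)) for y in grid])
```

One caveat worth noting: the fitted map y ↦ F̂(y|x) across a grid of thresholds need not be monotone in finite samples and is typically monotonized afterwards (e.g., by rearrangement).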


Journal ArticleDOI
TL;DR: In this paper, the authors provide a method to estimate the payoff functions of players in complete information, static, discrete games, and investigate the empirical importance of firm heterogeneity as a determinant of market structure in the U.S. airline industry.
Abstract: We provide a practical method to estimate the payoff functions of players in complete information, static, discrete games. With respect to the empirical literature on entry games originated by Bresnahan and Reiss (1990) and Berry (1992), the main novelty of our framework is to allow for general forms of heterogeneity across players without making equilibrium selection assumptions. We allow the effects that the entry of each individual airline has on the profits of its competitors, its “competitive effects,” to differ across airlines. The identified features of the model are sets of parameters (partial identification) such that the choice probabilities predicted by the econometric model are consistent with the empirical choice probabilities estimated from the data. We apply this methodology to investigate the empirical importance of firm heterogeneity as a determinant of market structure in the U.S. airline industry. We find evidence of heterogeneity across airlines in their profit functions. The competitive effects of large airlines (American, Delta, United) are different from those of low cost carriers and Southwest. Also, the competitive effect of an airline is increasing in its airport presence, which is an important measure of observable heterogeneity in the airline industry. Then we develop a policy experiment to estimate the effect of repealing the Wright Amendment on competition in markets out of the Dallas airports. We find that repealing the Wright Amendment would increase the number of markets served out of Dallas Love.

469 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigate the post-intervention effects of paying people to attend a gym a number of times during one month and find marked attendance increases after the intervention relative to attendance changes for the respective control groups.
Abstract: Can incentives be effective in encouraging the development of good habits? We investigate the post-intervention effects of paying people to attend a gym a number of times during one month. In two studies we find marked attendance increases after the intervention relative to attendance changes for the respective control groups. This is entirely driven by people who did not previously attend the gym on a regular basis. In our second study, we find improvements on health indicators such as weight, waist size, and pulse rate, suggesting the intervention led to a net increase in total physical activity rather than to a substitution away from nonincentivized activities. We argue that there is scope for financial intervention in habit formation, particularly in the area of health.

467 citations


ReportDOI
TL;DR: In this article, the authors used control variables to identify and estimate models with nonseparable, multidimensional disturbances, with instruments and disturbances independent and a reduced form that is strictly monotonic in a scalar disturbance.
Abstract: This paper uses control variables to identify and estimate models with nonseparable, multidimensional disturbances. Triangular simultaneous equations models are considered, with instruments and disturbances that are independent and a reduced form that is strictly monotonic in a scalar disturbance. Here it is shown that the conditional cumulative distribution function of the endogenous variable given the instruments is a control variable. Also, for any control variable, identification results are given for quantile, average, and policy effects. Bounds are given when a common support assumption is not satisfied. Estimators of identified objects and bounds are provided, and a demand analysis empirical example is given.

460 citations
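The two-step logic in the abstract above — first construct the control variable V = F(X|Z), the conditional CDF of the endogenous variable given the instrument, then control for it flexibly in the outcome equation — can be sketched for a toy triangular model. Everything here is an illustrative simplification (binary instrument, within-group empirical CDF, cubic in V); the paper's identification results are nonparametric and far more general:

```python
import numpy as np

def control_variable_binary_z(X, Z):
    """Estimate V = F_{X|Z}(X|Z) by the within-group empirical CDF (Z binary)."""
    V = np.empty_like(X)
    for z in (0, 1):
        m = Z == z
        ranks = X[m].argsort().argsort() + 1.0   # 1..n_z ranks within the Z = z cell
        V[m] = ranks / (m.sum() + 1.0)
    return V

def control_function_slope(Y, X, V):
    """OLS of Y on X, controlling flexibly (here: a cubic) for the control variable V."""
    W = np.column_stack([np.ones_like(X), X, V, V ** 2, V ** 3])
    coef = np.linalg.lstsq(W, Y, rcond=None)[0]
    return coef[1]   # slope on X
```

In a simulation where X is endogenous through the first-stage disturbance, plain OLS is badly biased while the control-function slope is close to the truth, because conditional on V the remaining variation in X comes only through the instrument.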


Journal ArticleDOI
TL;DR: In this article, the authors develop a test of the null of k0 factors against the alternative that the number of factors is larger than k0 but no larger than k1 > k0.
Abstract: In this paper we study high-dimensional time series that have the generalized dynamic factor structure. We develop a test of the null of k0 factors against the alternative that the number of factors is larger than k0 but no larger than k1 > k0. Our test statistic equals max k0

Journal ArticleDOI
TL;DR: The author studies "all-pay contests," a class of games that captures general asymmetries and sunk investments inherent in scenarios such as lobbying, competition for market power, labor-market tournaments, and R&D races, and provides a closed-form formula for players' equilibrium payoffs together with an analysis of player participation.
Abstract: This paper studies a class of games, "all-pay contests," which capture general asymmetries and sunk investments inherent in scenarios such as lobbying, competition for market power, labor-market tournaments, and R&D races. Players compete for one of several identical prizes by choosing a score. Conditional on winning or losing, it is weakly better to do so with a lower score. This formulation allows for differing production technologies, costs of capital, prior investments, attitudes toward risk, and conditional and unconditional investments, among others. I provide a closed-form formula for players' equilibrium payoffs and analyze player participation. A special case of contests is multiprize, complete-information all-pay auctions.

ReportDOI
TL;DR: In this paper, the authors examine the competition between a group of Internet retailers who operate in an environment where a price search engine plays a dominant role and show that for some products in this environment, the easy price search makes demand tremendously price sensitive.
Abstract: We examine the competition between a group of Internet retailers who operate in an environment where a price search engine plays a dominant role. We show that for some products in this environment, the easy price search makes demand tremendously price-sensitive. Retailers, though, engage in obfuscation—practices that frustrate consumer search or make it less damaging to firms—resulting in much less price sensitivity on some other products. We discuss several models of obfuscation and examine its effects on demand and markups empirically.

Journal ArticleDOI
TL;DR: In this paper, the authors study how trading frictions in asset markets affect the distribution of asset holdings, asset prices, efficiency, and standard measures of liquidity, and show that a reduction in trading frictions leads to an increase in the dispersion of asset holdings and trade volume.
Abstract: We study how trading frictions in asset markets affect the distribution of asset holdings, asset prices, efficiency, and standard measures of liquidity. To this end, we analyze the equilibrium and optimal allocations of a search-theoretic model of financial intermediation similar to Duffie, Gârleanu and Pedersen (2005). In contrast with the existing literature, the model we develop imposes no restrictions on asset holdings, so traders can accommodate frictions by varying their trading needs through changes in their asset positions. We find that this is a critical aspect of investor behavior in illiquid markets. A reduction in trading frictions leads to an increase in the dispersion of asset holdings and trade volume. Transaction costs and intermediaries’ incentives to make markets are nonmonotonic in trade frictions. With the entry of dealers, these nonmonotonicities give rise to an externality in liquidity provision that can lead to multiple equilibria. Tight spreads are correlated with large volume and short trading delays across equilibria. From a normative standpoint we show that the asset allocation across investors and the number of dealers are socially inefficient.

Journal ArticleDOI
TL;DR: In this article, the authors estimate the presence and importance of hidden information and hidden action problems in a consumer credit market using a new field experiment methodology, and they find strong evidence of moral hazard and weaker evidence of hidden-information problems.
Abstract: Information asymmetries are important in theory but difficult to identify in practice. We estimate the presence and importance of hidden information and hidden action problems in a consumer credit market using a new field experiment methodology. We randomized 58,000 direct mail offers to former clients of a major South African lender along three dimensions: (i) an initial “offer interest rate” featured on a direct mail solicitation; (ii) a “contract interest rate” that was revealed only after a borrower agreed to the initial offer rate; and (iii) a dynamic repayment incentive that was also a surprise and extended preferential pricing on future loans to borrowers who remained in good standing. These three randomizations, combined with complete knowledge of the lender's information set, permit identification of specific types of private information problems. Our setup distinguishes hidden information effects from selection on the offer rate (via unobservable risk and anticipated effort), from hidden action effects (via moral hazard in effort) induced by actual contract terms. We find strong evidence of moral hazard and weaker evidence of hidden information problems. A rough estimate suggests that perhaps 13% to 21% of default is due to moral hazard. Asymmetric information thus may help explain the prevalence of credit constraints even in a market that specializes in financing high-risk borrowers.

Journal ArticleDOI
TL;DR: The authors present evidence on the effect of social connections between workers and managers on productivity in the workplace, and show that favoring connected workers is detrimental to the firm's overall performance.
Abstract: We present evidence on the effect of social connections between workers and managers on productivity in the workplace. To evaluate whether the existence of social connections is beneficial to the firm's overall performance, we explore how the effects of social connections vary with the strength of managerial incentives and worker's ability. To do so, we combine panel data on individual worker's productivity from personnel records with a natural field experiment in which we engineered an exogenous change in managerial incentives, from fixed wages to bonuses based on the average productivity of the workers managed. We find that when managers are paid fixed wages, they favor workers to whom they are socially connected irrespective of the worker's ability, but when they are paid performance bonuses, they target their effort toward high ability workers irrespective of whether they are socially connected to them or not. Although social connections increase the performance of connected workers, we find that favoring connected workers is detrimental for the firm's overall performance.

Journal ArticleDOI
TL;DR: In this article, the authors study asset prices in a two-agent macroeconomic model with two key features: limited stock market participation and heterogeneity in the elasticity of intertemporal substitution in consumption.
Abstract: I study asset prices in a two-agent macroeconomic model with two key features: limited stock market participation and heterogeneity in the elasticity of intertemporal substitution in consumption (EIS). The model is consistent with some prominent features of asset prices, such as a high equity premium, relatively smooth interest rates, procyclical stock prices, and countercyclical variation in the equity premium, its volatility, and in the Sharpe ratio. In this model, the risk-free asset market plays a central role by allowing non-stockholders (with low EIS) to smooth the fluctuations in their labor income. This process concentrates non-stockholders' labor income risk among a small group of stockholders, who then demand a high premium for bearing the aggregate equity risk. Furthermore, this mechanism is consistent with the very small share of aggregate wealth held by non-stockholders in the U.S. data, which has proved problematic for previous models with limited participation. I show that this large wealth inequality is also important for the model's ability to generate a countercyclical equity premium. When it comes to business cycle performance, the model's progress has been more limited: consumption is still too volatile compared to the data, whereas investment is still too smooth. These are important areas for potential improvement in this framework.

Journal ArticleDOI
TL;DR: In this article, the authors employ the same domain for preference and a closely related (but weaker) set of axioms to characterize preferences that use second-order beliefs (beliefs over probability measures).
Abstract: Anscombe and Aumann (1963) wrote a classic characterization of subjective expected utility theory. This paper employs the same domain for preference and a closely related (but weaker) set of axioms to characterize preferences that use second-order beliefs (beliefs over probability measures). Such preferences are of interest because they accommodate Ellsberg-type behavior.

ReportDOI
TL;DR: This work develops a practical and novel method for inference on intersection bounds, namely bounds defined by either the infimum or supremum of a parametric or nonparametric function, or equivalently, the value of a linear programming problem with a potentially infinite constraint set.
Abstract: We develop a practical and novel method for inference on intersection bounds, namely bounds defined by either the infimum or supremum of a parametric or nonparametric function, or, equivalently, the value of a linear programming problem with a potentially infinite constraint set. We show that many bounds characterizations in econometrics, for instance bounds on parameters under conditional moment inequalities, can be formulated as intersection bounds. Our approach is especially convenient for models comprised of a continuum of inequalities that are separable in parameters, and also applies to models with inequalities that are nonseparable in parameters. Since analog estimators for intersection bounds can be severely biased in finite samples, routinely underestimating the size of the identified set, we also offer a median-bias-corrected estimator of such bounds as a by-product of our inferential procedures. We develop theory for large sample inference based on the strong approximation of a sequence of series or kernel-based empirical processes by a sequence of "penultimate" Gaussian processes. These penultimate processes are generally not weakly convergent, and thus are non-Donsker. Our theoretical results establish that we can nonetheless perform asymptotically valid inference based on these processes. Our construction also provides new adaptive inequality/moment selection methods. We provide conditions for the use of nonparametric kernel and series estimators, including a novel result that establishes strong approximation for any general series estimator admitting linearization, which may be of independent interest.

Journal ArticleDOI
TL;DR: In this article, the authors propose a new variance estimator for generalized empirical likelihood (GEL) that is consistent under the usual asymptotics and, under many weak moment asymptotics, is larger than the usual estimator and remains consistent.
Abstract: Using many moment conditions can improve efficiency but makes the usual generalized method of moments (GMM) inferences inaccurate. Two-step GMM is biased. Generalized empirical likelihood (GEL) has smaller bias, but the usual standard errors are too small in instrumental variable settings. In this paper we give a new variance estimator for GEL that addresses this problem. It is consistent under the usual asymptotics and, under many weak moment asymptotics, is larger than usual and is consistent. We also show that the Kleibergen (2005) Lagrange multiplier and conditional likelihood ratio statistics are valid under many weak moments. In addition, we introduce a jackknife GMM estimator, but find that GEL is asymptotically more efficient under many weak moments. In Monte Carlo examples we find that t-statistics based on the new variance estimator have nearly correct size in a wide range of cases.

Journal ArticleDOI
TL;DR: The authors derive sufficient conditions for nonparametric identifiability of type probabilities and type-specific component distributions in finite mixture models of dynamic discrete choices under different assumptions on the Markov property, stationarity, and type invariance in the transition process.
Abstract: In dynamic discrete choice analysis, controlling for unobserved heterogeneity is an important issue, and finite mixture models provide flexible ways to account for it. This paper studies nonparametric identifiability of type probabilities and type-specific component distributions in finite mixture models of dynamic discrete choices. We derive sufficient conditions for nonparametric identification for various finite mixture models of dynamic discrete choices used in applied work under different assumptions on the Markov property, stationarity, and type invariance in the transition process. Three elements emerge as the important determinants of identification: the time dimension of the panel data, the number of values the covariates can take, and the heterogeneity of the response of different types to changes in the covariates.

Journal ArticleDOI
TL;DR: In this paper, a direct revelation mechanism for eliciting agents' subjective probabilities is described, and the game induced by the mechanism has a dominant strategy equilibrium in which the players reveal their subjective probabilities.
Abstract: This paper describes a direct revelation mechanism for eliciting agents' subjective probabilities. The game induced by the mechanism has a dominant strategy equilibrium in which the players reveal their subjective probabilities.
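The paper's mechanism is game-theoretic, but the core incentive idea — make a report's expected reward maximal exactly at the agent's true subjective probability — is the same one behind classical proper scoring rules. A toy illustration using the quadratic (Brier) scoring rule, which is a standard device and not the paper's mechanism:

```python
def quadratic_score(report, outcome):
    """Brier/quadratic score for reporting probability `report` of a binary event.
    outcome is 1 if the event occurred, 0 otherwise."""
    return 1.0 - (outcome - report) ** 2

def expected_score(report, true_p):
    """Expected score of a report, from the viewpoint of an agent whose
    subjective probability of the event is true_p."""
    return true_p * quadratic_score(report, 1) + (1 - true_p) * quadratic_score(report, 0)
```

Differentiating the expected score in the report gives 2(true_p − report), so the unique maximizer is report = true_p: truthful revelation is optimal whatever the agent believes.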

Journal ArticleDOI
TL;DR: The authors develop a theory of optimal stopping under Knightian uncertainty, deriving a martingale theory for multiple priors that extends the classical dynamic programming (Snell envelope) approach to the multiple-priors setting.
Abstract: We develop a theory of optimal stopping under Knightian uncertainty. A suitable martingale theory for multiple priors is derived that extends the classical dynamic programming or Snell envelope approach to multiple priors. We relate the multiple prior theory to the classical setup via a minimax theorem. In a multiple prior version of the classical model of independent and identically distributed random variables, we discuss several examples from microeconomics, operation research, and finance. For monotone payoffs, the worst-case prior can be identified quite easily with the help of stochastic dominance arguments. For more complex payoff structures like barrier options, model ambiguity leads to stochastic changes in the worst-case beliefs.
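The multiple-priors Snell envelope can be illustrated by backward induction on a binomial tree: at each node the agent compares the payoff from stopping now with the worst-case (over priors) expected continuation value. A toy sketch, assuming a finite set of up-move probabilities as the prior set (the tree and payoffs are illustrative, not from the paper):

```python
def robust_stopping_value(payoff, p_set):
    """Multiple-priors Snell envelope on a recombining binomial tree.

    payoff[t][j] = reward from stopping at time t after j up-moves.
    p_set       = finite set of up-move probabilities (the priors).
    Returns the time-0 value: max(stop now, worst-case expected continuation).
    """
    T = len(payoff) - 1
    V = list(payoff[T])            # at the horizon the agent must stop
    for t in range(T - 1, -1, -1):
        V = [
            max(
                payoff[t][j],      # stop now
                min(p * V[j + 1] + (1 - p) * V[j] for p in p_set),  # worst prior
            )
            for j in range(t + 1)
        ]
    return V[0]
```

With a singleton prior set this reduces to the classical Snell envelope; enlarging the prior set can only (weakly) lower the value, and for monotone payoffs the minimum is attained at the least favorable up-probability, matching the stochastic-dominance intuition in the abstract.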

Journal ArticleDOI
TL;DR: In this paper, the authors show that the local-constancy approximation can yield asymptotic properties (consistency, normality) that are correct subject to an ex post adjustment involving asymPTotic likelihood ratios.
Abstract: The econometric literature of high frequency data often relies on moment estimators which are derived from assuming local constancy of volatility and related quantities. We here study this local-constancy approximation as a general approach to estimation in such data. We show that the technique yields asymptotic properties (consistency, normality) that are correct subject to an ex post adjustment involving asymptotic likelihood ratios. These adjustments are derived and documented. Several examples of estimation are provided: powers of volatility, leverage effect, and integrated betas. The first order approximations based on local constancy can be over the period of one observation or over blocks of successive observations. It has the advantage of gaining in transparency in defining and analyzing estimators. The theory relies heavily on the interplay between stable convergence and measure change, and on asymptotic expansions for martingales.

Journal ArticleDOI
TL;DR: In this article, the authors study the first-price auction model with risk-averse bidders and establish the nonparametric identification of the bidders' utility function under exclusion restrictions.
Abstract: This paper studies the nonparametric identification of the first-price auction model with risk averse bidders within the private value paradigm. First, we show that the benchmark model is nonidentified from observed bids. We also derive the restrictions imposed by the model on observables and show that these restrictions are weak. Second, we establish the nonparametric identification of the bidders' utility function under exclusion restrictions. Our primary exclusion restriction takes the form of an exogenous bidders' participation, leading to a latent distribution of private values that is independent of the number of bidders. The key idea is to exploit the property that the bid distribution varies with the number of bidders while the private value distribution does not. We then extend these results to endogenous bidders' participation when the exclusion restriction takes the form of instruments that do not affect the bidders' private value distribution. Though derived for a benchmark model, our results extend to more general cases such as a binding reserve price, affiliated private values, and asymmetric bidders. Last, possible estimation methods are proposed.

Journal ArticleDOI
TL;DR: In this paper, the authors incorporate the retirement decision into a version of Ben-Porath's (1967) model and find that a necessary condition for this causal relationship to hold is that increased life expectancy will also increase lifetime labor supply.
Abstract: Conventional wisdom suggests that increased life expectancy had a key role in causing a rise in investment in human capital. I incorporate the retirement decision into a version of Ben-Porath's (1967) model and find that a necessary condition for this causal relationship to hold is that increased life expectancy will also increase lifetime labor supply. I then show that this condition does not hold for American men born between 1840 and 1970 and for the American population born between 1890 and 1970. The data suggest similar patterns in Western Europe. I end by discussing the implications of my findings for the debate on the fundamental causes of long-run growth.

Journal ArticleDOI
TL;DR: In this paper, the authors consider the independent and identically distributed (i.i.d.) bootstrap and the wild bootstrap (WB), and prove their first-order asymptotic validity under general assumptions on the log-price process that allow for drift and leverage effects.
Abstract: We propose bootstrap methods for a general class of nonlinear transformations of realized volatility which includes the raw version of realized volatility and its logarithmic transformation as special cases. We consider the independent and identically distributed (i.i.d.) bootstrap and the wild bootstrap (WB), and prove their first-order asymptotic validity under general assumptions on the log-price process that allow for drift and leverage effects. We derive Edgeworth expansions in a simpler model that rules out these effects. The i.i.d. bootstrap provides a second-order asymptotic refinement when volatility is constant, but not otherwise. The WB yields a second-order asymptotic refinement under stochastic volatility provided we choose the external random variable used to construct the WB data appropriately. None of these methods provides third-order asymptotic refinements. Both methods improve upon the first-order asymptotic theory in finite samples.
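The i.i.d. bootstrap for realized volatility can be sketched as follows: resample the intraday returns with replacement and recompute the statistic on each resample. Constant volatility, the sample sizes, and the percentile interval below are simplifying assumptions for illustration, not the paper's exact procedure; the WB variant would instead multiply each return by a draw of an external random variable with suitably chosen moments.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated intraday log-returns (constant volatility for simplicity).
n = 390
sigma, dt = 0.2, 1.0 / n
r = sigma * np.sqrt(dt) * rng.standard_normal(n)

rv = np.sum(r**2)   # realized volatility (in variance form)

# i.i.d. bootstrap: resample the n returns with replacement B times
# and recompute realized volatility on each bootstrap sample.
B = 2000
r_star = rng.choice(r, size=(B, n), replace=True)
rv_star = np.sum(r_star**2, axis=1)

# Percentile interval for the integrated variance.
lo, hi = np.quantile(rv_star, [0.025, 0.975])
print(rv, (lo, hi))
```

The bootstrap distribution of rv_star is centered at the observed realized volatility by construction, so the percentile interval brackets it.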

ReportDOI
TL;DR: The authors used indirect inference to estimate a joint model of earnings, employment, job changes, wage rates, and work hours over a career, and measured the relative contributions of the shocks to the variance of earnings in a given year and over a lifetime.
Abstract: In this paper, we use indirect inference to estimate a joint model of earnings, employment, job changes, wage rates, and work hours over a career. We use the model to address a number of important questions in labor economics, including the source of the experience profile of wages, the response of job changes to outside wage offers, and the effects of seniority on job changes. We also study the dynamic response of wage rates, hours, and earnings to various shocks, and measure the relative contributions of the shocks to the variance of earnings in a given year and over a lifetime. We find that human capital accounts for most of the growth of earnings over a career, although job seniority and job mobility also play significant roles. Unemployment shocks have a large impact on earnings in the short run, as well as a substantial long-term effect that operates through the wage rate. Shocks associated with job changes and unemployment make a large contribution to the variance of career earnings and operate mostly through the job-specific error components of wages and hours.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a model of decision under ambiguity termed vector expected utility, or VEU, where an uncertain prospect, or act, is assessed according to a baseline expected-utility evaluation and an adjustment that reflects the individual's perception of ambiguity and her attitudes toward it.
Abstract: This paper proposes a model of decision under ambiguity termed vector expected utility, or VEU. In this model, an uncertain prospect, or Savage act, is assessed according to (a) a baseline expected-utility evaluation, and (b) an adjustment that reflects the individual's perception of ambiguity and her attitudes toward it. The adjustment is itself a function of the act's exposure to distinct sources of ambiguity, as well as its variability. The key elements of the VEU model are a baseline probability and a collection of random variables, or adjustment factors, which represent acts exposed to distinct ambiguity sources and also reflect complementarities among ambiguous events. The adjustment to the baseline expected-utility evaluation of an act is a function of the covariance of its utility profile with each adjustment factor, which reflects exposure to the corresponding ambiguity source. A behavioral characterization of the VEU model is provided. Furthermore, an updating rule for VEU preferences is proposed and characterized. The suggested updating rule facilitates the analysis of sophisticated dynamic choice with VEU preferences.
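The VEU functional can be illustrated on a finite state space: an act's value is its baseline expected utility plus an adjustment that depends on the covariances of the act's utility profile with the adjustment factors. The specific adjustment function A(c) = -sum|c_i| is my own illustrative choice (it is nonpositive, vanishes at zero, and is symmetric, as the model requires); the probabilities, factor, and payoffs are toy numbers, not from the paper.

```python
import numpy as np

# Toy VEU evaluation on a four-state space:
#   V(f) = E_p[u(f)] + A(cov(u(f), zeta_1), ..., cov(u(f), zeta_m)),
# with the illustrative adjustment A(c) = -penalty * sum_i |c_i|.
p = np.array([0.25, 0.25, 0.25, 0.25])        # baseline probability
zeta = np.array([[1.0, -1.0, 1.0, -1.0]])     # one adjustment factor, E_p[zeta] = 0

def veu(utilities, p, zeta, penalty=1.0):
    base = p @ utilities                       # baseline expected utility
    # Since E_p[zeta_i] = 0, E_p[(u - E_p u) * zeta_i] is the covariance
    # of the utility profile with each adjustment factor.
    cov = zeta @ (p * (utilities - base))
    return base - penalty * np.sum(np.abs(cov))

# An act exposed to the ambiguity source (payoffs comove with zeta)
# is penalized; a constant act gets no adjustment.
ambiguous = veu(np.array([1.0, 0.0, 1.0, 0.0]), p, zeta)
constant = veu(np.array([0.5, 0.5, 0.5, 0.5]), p, zeta)
print(ambiguous, constant)
```

Both acts have baseline expected utility 0.5, but the ambiguous act's covariance with the factor drags its VEU value below the constant act's.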

Journal ArticleDOI
TL;DR: In this article, structural nonparametric estimation of a structural cointegrating regression model is studied, where the regressor and the dependent variable are jointly dependent and contemporaneously correlated.
Abstract: Nonparametric estimation of a structural cointegrating regression model is studied. As in the standard linear cointegrating regression model, the regressor and the dependent variable are jointly dependent and contemporaneously correlated. In nonparametric estimation problems, joint dependence is known to be a major complication that affects identification, induces bias in conventional kernel estimates, and frequently leads to ill-posed inverse problems. In functional cointegrating regressions where the regressor is an integrated or near-integrated time series, it is shown here that inverse and ill-posed inverse problems do not arise. Instead, simple nonparametric kernel estimation of a structural nonparametric cointegrating regression is consistent and the limit distribution theory is mixed normal, giving straightforward asymptotics that are useable in practical work. It is further shown that use of augmented regression, as is common in linear cointegration modeling to address endogeneity, does not lead to bias reduction in nonparametric regression, but there is an asymptotic gain in variance reduction. The results provide a convenient basis for inference in structural nonparametric regression with nonstationary time series when there is a single integrated or near-integrated regressor. The methods may be applied to a range of empirical models where functional estimation of cointegrating relations is required.
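The headline result, that plain kernel regression remains consistent even when the regressor is integrated and contemporaneously correlated with the error, can be sketched numerically. The regression function, error structure, sample size, and bandwidth below are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(2)

# y_t = f(x_t) + u_t with x_t a random walk (integrated regressor) and
# u_t correlated with the increments of x_t (endogeneity).
n = 20000
dx = rng.standard_normal(n)
x = np.cumsum(dx)                              # integrated regressor
f = lambda v: np.sin(v / 5.0) + 0.1 * v        # illustrative regression function
u = 0.3 * dx + 0.4 * rng.standard_normal(n)    # contemporaneously correlated error
y = f(x) + u

def nw(x0, x, y, h):
    """Nadaraya-Watson estimator with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

x0 = np.median(x)                              # a level the path visits often
print(nw(x0, x, y, 1.0), f(x0))                # kernel estimate vs. truth
```

Despite the endogeneity, the simple kernel estimate lands near f(x0): the recurrence of the integrated regressor supplies many observations near any visited level, and no instrumenting or augmented regression is needed for consistency.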