Book Chapter

Chapter 71 Econometric Evaluation of Social Programs, Part II: Using the Marginal Treatment Effect to Organize Alternative Econometric Estimators to Evaluate Social Programs, and to Forecast their Effects in New Environments ⁎

TL;DR: The marginal treatment effect (MTE) as mentioned in this paper is a choice-theoretic parameter that can be interpreted as a willingness to pay parameter for persons at a margin of indifference between participating in an activity or not.
Abstract: This chapter uses the marginal treatment effect (MTE) to unify and organize the econometric literature on the evaluation of social programs. The marginal treatment effect is a choice-theoretic parameter that can be interpreted as a willingness to pay parameter for persons at a margin of indifference between participating in an activity or not. All of the conventional treatment parameters as well as the more economically motivated treatment effects can be generated from a baseline marginal treatment effect. All of the estimation methods used in the applied evaluation literature, such as matching, instrumental variables, regression discontinuity methods, selection and control function methods, make assumptions about the marginal treatment effect which we exposit. Models for multiple outcomes are developed. Empirical examples of the leading methods are presented. Methods are presented for bounding treatment effects in partially identified models, when the marginal treatment effect is known only over a limited support. We show how to use the marginal treatment effect in econometric cost-benefit analysis, in defining limits of policy experiments, in constructing the average marginal treatment effect, and in forecasting the effects of programs in new environments.
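
A minimal numerical sketch of the weighting idea in the abstract, not taken from the chapter: the conventional parameters are computed as weighted averages of a single MTE curve. The linear MTE shape, the Beta distribution assumed for the propensity score P(Z), and all parameter values are hypothetical illustrations.

```python
# Minimal sketch: conventional treatment parameters as weighted averages of
# MTE(u) = E[Y1 - Y0 | U_D = u].  The MTE shape and the distribution of the
# propensity score P(Z) are hypothetical.
import numpy as np

u = np.linspace(0.0005, 0.9995, 1000)                    # grid over the unobserved resistance U_D
mte = 0.4 - 0.5 * u                                      # hypothetical MTE: gains decline with resistance
p = np.random.default_rng(0).beta(2, 2, 100_000)         # hypothetical propensity scores P(Z)

w_ate = np.ones_like(u)                                  # ATE weights every margin equally
w_tt = np.array([(p > ui).mean() for ui in u]) / p.mean()          # treated over-represent low-u people
w_tut = np.array([(p <= ui).mean() for ui in u]) / (1 - p).mean()  # untreated over-represent high-u people

du = u[1] - u[0]
ate, tt, tut = (np.sum(mte * w) * du for w in (w_ate, w_tt, w_tut))
print(f"ATE={ate:.3f}  TT={tt:.3f}  TUT={tut:.3f}")      # with a declining MTE: TT > ATE > TUT
```

Changing only the weight function on the same MTE grid produces the other parameters discussed in the chapter, which is the sense in which the MTE organizes the different estimands.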
Citations
Journal Article
TL;DR: As discussed by the authors, the econometric and statistical analysis of causal effects of programs and policies has, over the last two decades, reached a level of maturity that makes it an important tool in many areas of empirical research in economics, including labor economics, public finance, development economics, industrial organization, and other areas of empirical microeconomics.
Abstract: Many empirical questions in economics and other social sciences depend on causal effects of programs or policies. In the last two decades, much research has been done on the econometric and statistical analysis of such causal effects. This recent theoretical literature has built on, and combined features of, earlier work in both the statistics and econometrics literatures. It has by now reached a level of maturity that makes it an important tool in many areas of empirical research in economics, including labor economics, public finance, development economics, industrial organization, and other areas of empirical microeconomics. In this review, we discuss some of the recent developments. We focus primarily on practical issues for empirical researchers, as well as provide a historical overview of the area and give references to more technical research.

3,175 citations

Posted Content
TL;DR: In this paper, the authors investigated conditions sufficient for identification of average treatment effects using instrumental variables and showed that the existence of valid instruments is not sufficient to identify any meaningful average treatment effect.
Abstract: We investigate conditions sufficient for identification of average treatment effects using instrumental variables. First we show that the existence of valid instruments is not sufficient to identify any meaningful average treatment effect. We then establish that the combination of an instrument and a condition on the relation between the instrument and the participation status is sufficient for identification of a local average treatment effect for those who can be induced to change their participation status by changing the value of the instrument. Finally we derive the probability limit of the standard IV estimator under these conditions. It is seen to be a weighted average of local average treatment effects.
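
A small simulation sketch of the result summarized above, with a purely hypothetical data-generating process: a valid binary instrument plus monotonicity makes the Wald/IV ratio equal the average effect for those induced to participate by the instrument (the compliers).

```python
# Sketch: with a valid binary instrument and monotonicity, the Wald ratio recovers
# the average effect for compliers.  All parameter values are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
z = rng.integers(0, 2, n)                                               # binary instrument
kind = rng.choice(["always", "complier", "never"], n, p=[0.2, 0.5, 0.3])
d = np.where(kind == "always", 1, np.where(kind == "complier", z, 0))   # monotone participation

effect = np.where(kind == "complier", 2.0, 0.5)                 # heterogeneous treatment effects
y = rng.normal(size=n) + 0.3 * (kind == "always") + d * effect  # selection on levels and gains

wald = (y[z == 1].mean() - y[z == 0].mean()) / (d[z == 1].mean() - d[z == 0].mean())
print(f"Wald/IV estimate: {wald:.2f} (complier effect = 2.0)")
```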

3,154 citations

Journal Article
TL;DR: The authors estimate a dynamic factor model to solve the problems of endogeneity of inputs and of multiplicity of inputs relative to instruments, and explore the role of family environments in shaping cognitive and noncognitive skills at different stages of the child's life cycle.
Abstract: This paper estimates models of the evolution of cognitive and noncognitive skills and explores the role of family environments in shaping these skills at different stages of the life cycle of the child. Central to this analysis is identification of the technology of skill formation. We estimate a dynamic factor model to solve the problem of endogeneity of inputs and multiplicity of inputs relative to instruments. We identify the scale of the factors by estimating their effects on adult outcomes. In this fashion we avoid reliance on test scores and changes in test scores that have no natural metric. Parental investments are generally more effective in raising noncognitive skills. Noncognitive skills promote the formation of cognitive skills but, in most specifications of our model, cognitive skills do not promote the formation of noncognitive skills. Parental inputs have different effects at different stages of the child’s life cycle with cognitive skills affected more at early ages and noncognitive skills affected more at later ages.

1,636 citations


Cites background from "Chapter 71 Econometric Evaluation o..."

  • ...See Heckman and Robb (1985), Heckman and Vytlacil (2007) and Matzkin (2007) for a discussion of replacement functions....

  • ...See Herrnstein and Murray (1994), Murnane, Willett, and Levy (1995), and Cawley, Heckman, and Vytlacil (2001). (2) See Heckman, Stixrud, and Urzua (2006), Borghans, Duckworth, Heckman, and ter Weel (2008) and the references they cite. See also the special issue of the Journal of Human Resources 43 (4), Fall 2008 on noncognitive skills. (3) See Cunha, Heckman, Lochner, and Masterov (2006) and Cunha and Heckman (2007, 2009). (4) This evidence is summarized in Knudsen, Heckman, Cameron, and Shonkoff (2006) and Heckman (2008). (5) See Shumway and Stoffer (1982) and Watson and Engle (1983) for early discussions of such models....

Report
TL;DR: In this paper, the authors formalize the concepts of self-productivity and complementarity of human capital investments and use them to explain the evidence on skill formation, and provide a theoretical framework for interpreting the evidence from a vast empirical literature, for guiding the next generation of empirical studies, and for formulating policy.
Abstract: This paper presents economic models of child development that capture the essence of recent findings from the empirical literature on skill formation. The goal of this essay is to provide a theoretical framework for interpreting the evidence from a vast empirical literature, for guiding the next generation of empirical studies, and for formulating policy. Central to our analysis is the concept that childhood has more than one stage. We formalize the concepts of self-productivity and complementarity of human capital investments and use them to explain the evidence on skill formation. Together, they explain why skill begets skill through a multiplier process. Skill formation is a life cycle process. It starts in the womb and goes on throughout life. Families play a role in this process that is far more important than the role of schools. There are multiple skills and multiple abilities that are important for adult success. Abilities are both inherited and created, and the traditional debate about nature versus nurture is scientifically obsolete. Human capital investment exhibits both self-productivity and complementarity. Skill attainment at one stage of the life cycle raises skill attainment at later stages of the life cycle (self-productivity). Early investment facilitates the productivity of later investment (complementarity). Early investments are not productive if they are not followed up by later investments (another aspect of complementarity). This complementarity explains why there is no equity-efficiency trade-off for early investment. The returns to investing early in the life cycle are high. Remediation of inadequate early investments is difficult and very costly as a consequence of both self-productivity and complementarity.
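
A stylized sketch of self-productivity and complementarity using a CES technology of skill formation; the functional form is a standard illustration of these two mechanisms, and the parameter values below are hypothetical, not estimates from the paper.

```python
# Stylized sketch of self-productivity and complementarity with a CES technology
#   theta_{t+1} = [gamma * theta_t**phi + (1 - gamma) * I_t**phi]**(1/phi).
# The parameter values are hypothetical.
gamma, phi = 0.5, -0.5                      # phi < 1: early skills and later investment are complements

def next_skill(theta, invest):
    return (gamma * theta**phi + (1 - gamma) * invest**phi) ** (1 / phi)

for early_invest in (0.5, 2.0):             # low vs. high early investment
    theta1 = next_skill(1.0, early_invest)  # self-productivity: theta1 carries into the next stage
    gain_from_late = next_skill(theta1, 2.0) - next_skill(theta1, 1.0)
    print(f"early I = {early_invest}: theta1 = {theta1:.2f}, "
          f"payoff to raising late I from 1 to 2 = {gain_from_late:.2f}")
# The payoff to the same later investment is larger after higher early investment,
# which is the sense in which remediation of low early investment is costly.
```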

1,585 citations

Posted Content
TL;DR: This article explored the interface between personality psychology and economics and examined the predictive power of personality and the stability of personality traits over the life cycle, and developed simple analytical frameworks for interpreting the evidence in personality psychology.
Abstract: This paper explores the interface between personality psychology and economics. We examine the predictive power of personality and the stability of personality traits over the life cycle. We develop simple analytical frameworks for interpreting the evidence in personality psychology and suggest promising avenues for future research.

1,206 citations

References
Journal Article
TL;DR: The authors discuss the central role of propensity scores and balancing scores in the analysis of observational studies and show that adjustment for the scalar propensity score is sufficient to remove bias due to all observed covariates.
Abstract: The results of observational studies are often disputed because of nonrandom treatment assignment. For example, patients at greater risk may be overrepresented in some treatment group. This paper discusses the central role of propensity scores and balancing scores in the analysis of observational studies. The propensity score is the (estimated) conditional probability of assignment to a particular treatment given a vector of observed covariates. Both large and small sample theory show that adjustment for the scalar propensity score is sufficient to remove bias due to all observed covariates. Applications include: matched sampling on the univariate propensity score which is equal percent bias reducing under more general conditions than required for discriminant matching, multivariate adjustment by subclassification on balancing scores where the same subclasses are used to estimate treatment effects for all outcome variables and in all subpopulations, and visual representation of multivariate adjustment by a two-dimensional plot.
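
A compact sketch of the two steps described above on hypothetical simulated data: estimate the scalar propensity score with a logistic first stage (an illustrative choice, not a prescription from the paper) and then adjust by subclassification on score quintiles.

```python
# Sketch: propensity-score estimation and subclassification on hypothetical data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 20_000
x = rng.normal(size=(n, 3))                                         # observed covariates
d = rng.binomial(1, 1 / (1 + np.exp(-(x[:, 0] + 0.5 * x[:, 1]))))   # assignment depends on X only
y = 1.0 * d + x[:, 0] + x[:, 1] + rng.normal(size=n)                # true treatment effect = 1.0

e_hat = LogisticRegression().fit(x, d).predict_proba(x)[:, 1]       # estimated propensity score

cuts = np.quantile(e_hat, [0.2, 0.4, 0.6, 0.8])                     # subclassify on score quintiles
stratum = np.digitize(e_hat, cuts)
effects, shares = [], []
for k in np.unique(stratum):
    m = stratum == k
    if d[m].any() and (d[m] == 0).any():                            # need both groups in the stratum
        effects.append(y[m][d[m] == 1].mean() - y[m][d[m] == 0].mean())
        shares.append(m.mean())
print("raw difference in means:", y[d == 1].mean() - y[d == 0].mean())      # biased upward
print("subclassification estimate:", np.average(effects, weights=shares))  # most of the bias removed
```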

23,744 citations

Journal Article
TL;DR: Using the result that, under the null hypothesis of no misspecification, an asymptotically efficient estimator must have zero asymptotic covariance with its difference from a consistent but asymptotically inefficient estimator, specification tests are devised for a number of model specifications in econometrics.
Abstract: Using the result that under the null hypothesis of no misspecification an asymptotically efficient estimator must have zero asymptotic covariance with its difference from a consistent but asymptotically inefficient estimator, specification tests are devised for a number of model specifications in econometrics. Local power is calculated for small departures from the null hypothesis. An instrumental variable test as well as tests for a time series cross section model and the simultaneous equation model are presented. An empirical model provides evidence that unobserved individual factors are present which are not orthogonal to the included right-hand-side variable in a common econometric specification of an individual wage equation.
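
A scalar sketch of the test logic described above, on a hypothetical simulated design in which the regressor is endogenous, so OLS is inconsistent while IV remains consistent but inefficient; the contrast between the two estimates is scaled by the difference of their variances.

```python
# Sketch of a Hausman-type contrast between OLS and IV on hypothetical data
# (single regressor, no intercept, homoskedastic errors for simplicity).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 50_000
z = rng.normal(size=n)                       # instrument
u = rng.normal(size=n)                       # structural error
x = 0.8 * z + 0.5 * u + rng.normal(size=n)   # regressor correlated with u -> OLS inconsistent
y = 1.0 * x + u

b_ols, b_iv = (x @ y) / (x @ x), (z @ y) / (z @ x)
s2_ols = np.sum((y - b_ols * x) ** 2) / n
s2_iv = np.sum((y - b_iv * x) ** 2) / n
v_ols = s2_ols / (x @ x)                     # Var(b_ols)
v_iv = s2_iv * (z @ z) / (z @ x) ** 2        # Var(b_iv); larger, since IV is less efficient

h = (b_iv - b_ols) ** 2 / (v_iv - v_ols)     # Hausman statistic, chi-squared(1) under the null
print(f"b_ols={b_ols:.3f}  b_iv={b_iv:.3f}  H={h:.1f}  p={stats.chi2.sf(h, 1):.4f}")
```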

16,198 citations


"Chapter 71 Econometric Evaluation o..." refers background or methods in this paper

  • ...See Todd (1999, 2007, 2008) for software and extensive discussion of the mechanics of matching....

  • ...To see the consequences of this violation in a regression setting, use Y = Y0 + D(Y1 − Y0) and take conditional expectations (see the discussion in Section 8)....

  • ...See Heckman (1992) for a discussion of randomization bias in economics....

  • ...More recent work analyzes distributions of outcomes [e.g., Aakvik, Heckman and Vytlacil (2005), Carneiro, Hansen and Heckman (2003)]....

  • ...See Powell (1994) for a survey....

Book
01 Jan 1979
Cook and Campbell (1979)

11,977 citations


"Chapter 71 Econometric Evaluation o..." refers background in this paper

  • ...They are manifestations of a more general problem termed “Hawthorne effects” that arise from observing any population [see Campbell and Stanley (1963), Cook and Campbell (1979)]....

Book
01 Jan 1963
Campbell and Stanley (1963)
TL;DR: A survey drawn from social science research which deals with correlational, ex post facto, true experimental, and quasi-experimental designs and makes methodological recommendations is presented in this article.
Abstract: A survey drawn from social-science research which deals with correlational, ex post facto, true experimental, and quasi-experimental designs and makes methodological recommendations.

10,916 citations


"Chapter 71 Econometric Evaluation o..." refers methods or result in this paper

  • ...In an application to wage equations, Card (1999, 2001) interprets the LATE estimator as identifying returns to marginal persons. Heckman (1996) notes that the actual margin of choice selected by the IV estimator is not identified by the instrument. It is unclear as to which segment of the population the return estimated by LATE applies. If the analyst is interested in knowing the average response (β̄), the effect of the policy on the outcomes of countries that adopt it (E(β | D = 1)) or the effect of the policy if a particular country adopts it, there is no guarantee that the IV estimator comes any closer to the desired target than the OLS estimator and indeed it may be more biased than OLS. Because different instruments define different parameters, having a wealth of different strong instruments does not improve the precision of the estimate of any particular parameter. This is in stark contrast with the traditional model with β ⊥ D. In that case, all valid instruments identify β̄. The Durbin (1954) – Wu (1973) – Hausman (1978) test for the validity of extra instruments applies to the traditional model. In the more general case with essential heterogeneity, because different instruments estimate different parameters, no clear inference emerges from such specification tests. When there are more than two distinct values of Z, Imbens and Angrist draw on the analysis of Yitzhaki (1989), which was refined in Yitzhaki (1996) and Yitzhaki and Schechtman (2004), to produce a weighted average of pairwise LATE parameters where the scalars Z are ordered to define the LATE parameter. In this case, IV is a weighted average of LATE parameters with nonnegative weights. Imbens and Angrist generalize this result to the case of vector Z assuming that instruments are monotonic functions of the probability of selection. Heckman and Vytlacil (1999, 2001b, 2005), Heckman, Urzua and Vytlacil (2006) and Carneiro, Heckman and Vytlacil (2006) generalize the analysis of Imbens and Angrist (1994) in several ways and we report their results in this chapter....

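The excerpt above notes that, with a multi-valued instrument, IV is a weighted average of pairwise LATE parameters with nonnegative weights. The following simulation sketch, using a hypothetical data-generating process rather than any of the chapter's empirical examples, verifies that decomposition numerically.

```python
# Sketch: with a three-valued instrument, Cov(Y,Z)/Cov(D,Z) equals a weighted average
# of Wald (LATE) estimates between adjacent instrument values; the weights are
# nonnegative when participation rises with Z.  The data-generating process is hypothetical.
import numpy as np

rng = np.random.default_rng(4)
n = 300_000
z_vals = np.array([0.0, 1.0, 2.0])
z = rng.choice(z_vals, n, p=[0.3, 0.4, 0.3])          # discrete instrument
v = rng.uniform(size=n)                               # latent resistance to participation
d = (v < 0.2 + 0.15 * z).astype(float)                # participation probability increases in z
y = rng.normal(size=n) + d * (2.0 - 1.5 * v)          # gains decline with resistance

iv = np.cov(y, z)[0, 1] / np.cov(d, z)[0, 1]          # standard IV estimate

lates, weights = [], []
for k in range(len(z_vals) - 1):
    lo, hi = z == z_vals[k], z == z_vals[k + 1]
    lates.append((y[hi].mean() - y[lo].mean()) / (d[hi].mean() - d[lo].mean()))
    weights.append((d[hi].mean() - d[lo].mean())
                   * sum((z == zl).mean() * (zl - z.mean()) for zl in z_vals[k + 1:]))
weights = np.array(weights) / np.sum(weights)
print("pairwise LATEs:", np.round(lates, 3), " weights:", np.round(weights, 3))
print("weighted average:", float(np.dot(weights, lates)), " IV:", float(iv))
# The last two numbers coincide: the decomposition is an algebraic identity in the sample.
```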