scispace - formally typeset
Author

Jonathan N. Katz

Bio: Jonathan N. Katz is an academic researcher at the California Institute of Technology. He has contributed to research on topics including voting and generalized least squares, has an h-index of 32, and has co-authored 68 publications receiving 13,069 citations. His previous affiliations include the University of Chicago and Carnegie Mellon University.


Papers
Journal ArticleDOI
TL;DR: The generalized least squares approach of Parks produces standard errors that lead to extreme overconfidence, often underestimating variability by 50% or more; a new method is offered that is easier to implement and produces accurate standard errors.
Abstract: We examine some issues in the estimation of time-series cross-section models, calling into question the conclusions of many published studies, particularly in the field of comparative political economy. We show that the generalized least squares approach of Parks produces standard errors that lead to extreme overconfidence, often underestimating variability by 50% or more. We also provide an alternative estimator of the standard errors that is correct when the error structures show complications found in this type of model. Monte Carlo analysis shows that these “panel-corrected standard errors” perform well. The utility of our approach is demonstrated via a reanalysis of one “social democratic corporatist” model.

5,670 citations
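The panel-corrected standard errors described in the abstract have a compact sandwich form: estimate the contemporaneous cross-unit error covariance from the OLS residuals, then wrap it in the usual (X'X)^-1 bread. A minimal numpy sketch, assuming a balanced panel with rows ordered unit-by-unit (the function name `pcse` and the data layout are assumptions for illustration, not the authors' code):

```python
import numpy as np

def pcse(X, resid, n_units, n_periods):
    """Panel-corrected standard errors -- a minimal sketch.

    Assumes a balanced panel whose rows are ordered unit-major:
    unit 1's T observations first, then unit 2's, and so on.
    """
    # Contemporaneous covariance of the errors across units, estimated
    # from OLS residuals: Sigma_ij = (1/T) * sum_t e_it * e_jt.
    E = resid.reshape(n_units, n_periods)
    sigma = E @ E.T / n_periods
    # "Meat" of the sandwich: X' (Sigma kron I_T) X, built unit by unit.
    k = X.shape[1]
    Xu = X.reshape(n_units, n_periods, k)
    meat = np.zeros((k, k))
    for i in range(n_units):
        for j in range(n_units):
            meat += sigma[i, j] * Xu[i].T @ Xu[j]
    bread = np.linalg.inv(X.T @ X)
    cov = bread @ meat @ bread
    return np.sqrt(np.diag(cov))
```

When the errors really are spherical, sigma is roughly a scaled identity and the result collapses toward the textbook OLS standard errors, which makes for a quick sanity check.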

Journal ArticleDOI
TL;DR: In this article, the authors propose a simple diagnostic for temporal dependence and a simple remedy, based on the idea that binary time-series-cross-section (BTSCS) data are identical to grouped duration data.
Abstract: Researchers typically analyze time-series-cross-section data with a binary dependent variable (BTSCS) using ordinary logit or probit. However, BTSCS observations are likely to violate the independence assumption of the ordinary logit or probit statistical model. It is well known that if the observations are temporally related, the results of an ordinary logit or probit analysis may be misleading. In this paper, we provide a simple diagnostic for temporal dependence and a simple remedy. Our remedy is based on the idea that BTSCS data are identical to grouped duration data. This remedy does not require the BTSCS analyst to acquire any further methodological skills, and it can be easily implemented in any standard statistical software package. While our approach is suitable for any type of BTSCS data, we provide examples and applications from the field of International Relations, where BTSCS data are frequently used. We use our methodology to reassess Oneal and Russett's (1997) findings regarding the relationship between economic interdependence, democracy, and peace. Our analyses show that (1) their finding that economic interdependence is associated with peace is an artifact of their failure to account for temporal dependence, yet (2) their finding that democracy inhibits conflict is upheld even taking duration dependence into account.

2,329 citations
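The grouped-duration remedy rests on a counter of time elapsed since the last event (the "peace years" variable familiar from International Relations applications); dummies or splines in this counter then enter an ordinary logit. A sketch of the counter in pandas (illustrative only; `time_since_event` is a hypothetical helper name, and rows are assumed to be time-ordered within each unit):

```python
import pandas as pd

def time_since_event(df, unit_col, event_col):
    """Periods elapsed since the last event within each unit (a sketch).

    The counter resets to zero in the observation after an event, so an
    event row records how long the current spell had lasted when it ended.
    """
    def spell(events):
        t, out = 0, []
        for e in events:
            out.append(t)
            t = 0 if e else t + 1
        return out
    return df.groupby(unit_col, sort=False)[event_col].transform(spell)
```

Something like `df["peace_years"] = time_since_event(df, "country", "war")` then yields the variable whose dummies diagnose and absorb the temporal dependence.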

Journal ArticleDOI
TL;DR: In this paper, a lagged dependent variable approach is proposed for analyzing time-series-cross-section data; it makes it easier for researchers to examine dynamics and allows for natural generalizations in a manner that the serially correlated errors approach does not.
Abstract: In a previous article we showed that ordinary least squares with panel corrected standard errors is superior to the Parks generalized least squares approach to the estimation of time-series-cross-section models. In this article we compare our proposed method with another leading technique, Kmenta's "cross-sectionally heteroskedastic and timewise autocorrelated" model. This estimator uses generalized least squares to correct for both panel heteroskedasticity and temporally correlated errors. We argue that it is best to model dynamics via a lagged dependent variable rather than via serially correlated errors. The lagged dependent variable approach makes it easier for researchers to examine dynamics and allows for natural generalizations in a manner that the serially correlated errors approach does not. We also show that the generalized least squares correction for panel heteroskedasticity is, in general, no improvement over ordinary least squares and is, in the presence of parameter heterogeneity, inferior to it. In the conclusion we present a unified method for analyzing time-series-cross-section data.

963 citations
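Implementing the lagged-dependent-variable specification is mechanical: build y at t-1 within each unit, drop each unit's first period, and run pooled OLS. A sketch under those assumptions (`ldv_ols` is a hypothetical name, not the authors' code):

```python
import numpy as np
import pandas as pd

def ldv_ols(df, unit, y, xs):
    """OLS with a lagged dependent variable -- a minimal sketch.

    Creates y_{t-1} within each unit (rows assumed time-ordered), drops
    each unit's first period, and fits pooled OLS.
    """
    d = df.copy()
    d["lag_y"] = d.groupby(unit)[y].shift(1)
    d = d.dropna(subset=["lag_y"])
    X = np.column_stack([np.ones(len(d)), d["lag_y"]] + [d[x] for x in xs])
    beta = np.linalg.lstsq(X, d[y].to_numpy(), rcond=None)[0]
    return beta  # [intercept, coefficient on lag_y, coefficients on xs]
```

The coefficient on the lag summarizes the speed of adjustment directly, which is part of why this form is easier to interpret than a serial-correlation correction.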

Journal ArticleDOI
TL;DR: In this article, the authors examine dynamic issues in the analysis of time-series-cross-section (TSCS) data, showing that there is nothing pernicious in using a lagged dependent variable and that all dynamic models implicitly or explicitly contain one, differing only in their assumed speeds of adjustment.
Abstract: This article deals with a variety of dynamic issues in the analysis of time-series–cross-section (TSCS) data. Although the issues raised are general, we focus on applications to comparative political economy, which frequently uses TSCS data. We begin with a discussion of specification and lay out the theoretical differences implied by the various types of dynamic models that can be estimated. It is shown that there is nothing pernicious in using a lagged dependent variable and that all dynamic models either implicitly or explicitly have such a variable; the differences between the models relate to assumptions about the speeds of adjustment of measured and unmeasured variables. When adjustment is quick, it is hard to differentiate between the various models; with slower speeds of adjustment, the various models make sufficiently different predictions that they can be tested against each other. As the speed of adjustment gets slower and slower, specification (and estimation) gets more and more tricky. We the...

597 citations
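The point that every dynamic model implicitly or explicitly contains a lagged dependent variable can be checked numerically: regressing the change in y on lagged y and lagged x (an error-correction form) is exactly the regression of y itself on the same variables, with the lagged-y coefficient shifted by one. A small simulated illustration (the data and parameter values are invented for the demo):

```python
import numpy as np

rng = np.random.default_rng(2)
T = 500
x = np.cumsum(rng.normal(size=T))  # a slowly moving regressor
y = np.zeros(T)
for t in range(1, T):
    # error-correction DGP: y closes 20% of its gap to 0.8*x each period
    y[t] = y[t - 1] + 0.2 * (0.8 * x[t - 1] - y[t - 1]) + 0.1 * rng.normal()

Z = np.column_stack([y[:-1], x[:-1]])
a = np.linalg.lstsq(Z, np.diff(y), rcond=None)[0]  # ECM form: dy on (y_lag, x_lag)
b = np.linalg.lstsq(Z, y[1:], rcond=None)[0]       # LDV form: y on (y_lag, x_lag)
# Same model, different parameterization:
# b[0] == 1 + a[0] and b[1] == a[1] (up to floating point).
```

The two fits are algebraically identical because y equals lagged y plus the change, so the "choice" between them is one of interpretation, not substance.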

Journal ArticleDOI
TL;DR: In this paper, the authors use a simple two-equation model, estimated by ordinary least squares regression, to analyze U.S. House election data from 1948 to 1990, and find that the growth of the overall incumbency advantage was driven principally by increases in the quality effect.
Abstract: Theory: A simple rational entry argument suggests that the value of incumbency consists not just of a direct effect, reflecting the value of resources (such as staff) attached to legislative office, but also of an indirect effect, reflecting the fact that stronger challengers are less likely to contest incumbent-held seats. The indirect effect is the product of a scare-off effect-the ability of incumbents to scare off high-quality challengers-and a quality effect-reflecting how much electoral advantage a party accrues when it has an experienced rather than an inexperienced candidate. Hypothesis: The growth of the overall incumbency advantage was driven principally by increases in the quality effect. Methods: We use a simple two-equation model, estimated by ordinary least-squares regression, to analyze U.S. House election data from 1948 to 1990. Results: Most of the increase in the incumbency advantage, at least down to 1980, came through increases in the quality effect (i.e., the advantage to the incumbent party of having a low-quality challenger). This suggests that the task for those wishing to explain the growth in the vote-denominated incumbency advantage is to explain why the quality effect grew. It also suggests that resource-based explanations of the growth in the incumbency advantage cannot provide a full explanation.

445 citations


Cited by
Book
28 Apr 2021
TL;DR: This book provides a comprehensive treatment of panel data econometrics, from the one-way and two-way error component regression models through dynamic panels, limited dependent variables, and nonstationary panels.
Abstract: Preface.
1. Introduction. 1.1 Panel Data: Some Examples. 1.2 Why Should We Use Panel Data? Their Benefits and Limitations.
2. The One-way Error Component Regression Model. 2.1 Introduction. 2.2 The Fixed Effects Model. 2.3 The Random Effects Model. 2.4 Maximum Likelihood Estimation. 2.5 Prediction. 2.6 Examples. 2.7 Selected Applications. 2.8 Computational Note.
3. The Two-way Error Component Regression Model. 3.1 Introduction. 3.2 The Fixed Effects Model. 3.3 The Random Effects Model. 3.4 Maximum Likelihood Estimation. 3.5 Prediction. 3.6 Examples. 3.7 Selected Applications.
4. Test of Hypotheses with Panel Data. 4.1 Tests for Poolability of the Data. 4.2 Tests for Individual and Time Effects. 4.3 Hausman's Specification Test. 4.4 Further Reading.
5. Heteroskedasticity and Serial Correlation in the Error Component Model. 5.1 Heteroskedasticity. 5.2 Serial Correlation.
6. Seemingly Unrelated Regressions with Error Components. 6.1 The One-way Model. 6.2 The Two-way Model. 6.3 Applications and Extensions.
7. Simultaneous Equations with Error Components. 7.1 Single Equation Estimation. 7.2 Empirical Example: Crime in North Carolina. 7.3 System Estimation. 7.4 The Hausman and Taylor Estimator. 7.5 Empirical Example: Earnings Equation Using PSID Data. 7.6 Extensions.
8. Dynamic Panel Data Models. 8.1 Introduction. 8.2 The Arellano and Bond Estimator. 8.3 The Arellano and Bover Estimator. 8.4 The Ahn and Schmidt Moment Conditions. 8.5 The Blundell and Bond System GMM Estimator. 8.6 The Keane and Runkle Estimator. 8.7 Further Developments. 8.8 Empirical Example: Dynamic Demand for Cigarettes. 8.9 Further Reading.
9. Unbalanced Panel Data Models. 9.1 Introduction. 9.2 The Unbalanced One-way Error Component Model. 9.3 Empirical Example: Hedonic Housing. 9.4 The Unbalanced Two-way Error Component Model. 9.5 Testing for Individual and Time Effects Using Unbalanced Panel Data. 9.6 The Unbalanced Nested Error Component Model.
10. Special Topics. 10.1 Measurement Error and Panel Data. 10.2 Rotating Panels. 10.3 Pseudo-panels. 10.4 Alternative Methods of Pooling Time Series of Cross-section Data. 10.5 Spatial Panels. 10.6 Short-run vs Long-run Estimates in Pooled Models. 10.7 Heterogeneous Panels.
11. Limited Dependent Variables and Panel Data. 11.1 Fixed and Random Logit and Probit Models. 11.2 Simulation Estimation of Limited Dependent Variable Models with Panel Data. 11.3 Dynamic Panel Data Limited Dependent Variable Models. 11.4 Selection Bias in Panel Data. 11.5 Censored and Truncated Panel Data Models. 11.6 Empirical Applications. 11.7 Empirical Example: Nurses' Labor Supply. 11.8 Further Reading.
12. Nonstationary Panels. 12.1 Introduction. 12.2 Panel Unit Roots Tests Assuming Cross-sectional Independence. 12.3 Panel Unit Roots Tests Allowing for Cross-sectional Dependence. 12.4 Spurious Regression in Panel Data. 12.5 Panel Cointegration Tests. 12.6 Estimation and Inference in Panel Cointegration Models. 12.7 Empirical Example: Purchasing Power Parity. 12.8 Further Reading.
References. Index. Each chapter closes with Notes and Problems.

10,363 citations
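The fixed effects model of the book's early chapters is usually computed with the "within" transformation: demean every variable by unit and run OLS, which is numerically identical to including a dummy for each unit. A minimal sketch (the helper name `within_ols` is invented for illustration):

```python
import numpy as np
import pandas as pd

def within_ols(df, unit, y, xs):
    """Fixed-effects ("within") estimator -- a minimal sketch.

    Demeans y and the regressors within each unit, then runs OLS on the
    demeaned data; the unit effects are swept out by the transformation.
    """
    g = df.groupby(unit)
    yd = (df[y] - g[y].transform("mean")).to_numpy()
    Xd = np.column_stack([df[x] - g[x].transform("mean") for x in xs])
    beta = np.linalg.lstsq(Xd, yd, rcond=None)[0]
    return beta
```

Because the demeaning removes anything constant within a unit, the estimator stays consistent even when the regressors are correlated with the unit effects, which is exactly the case where pooled OLS fails.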

Journal ArticleDOI
TL;DR: An ordered sequence of events or observations having a time component is called a time series; good examples include daily opening and closing stock prices, daily humidity, temperature, and pressure, and the annual gross domestic product of a country.
Abstract: Preface.
1. Difference Equations
2. Lag Operators
3. Stationary ARMA Processes
4. Forecasting
5. Maximum Likelihood Estimation
6. Spectral Analysis
7. Asymptotic Distribution Theory
8. Linear Regression Models
9. Linear Systems of Simultaneous Equations
10. Covariance-Stationary Vector Processes
11. Vector Autoregressions
12. Bayesian Analysis
13. The Kalman Filter
14. Generalized Method of Moments
15. Models of Nonstationary Time Series
16. Processes with Deterministic Time Trends
17. Univariate Processes with Unit Roots
18. Unit Roots in Multivariate Time Series
19. Cointegration
20. Full-Information Maximum Likelihood Analysis of Cointegrated Systems
21. Time Series Models of Heteroskedasticity
22. Modeling Time Series with Changes in Regime
A. Mathematical Review. B. Statistical Tables. C. Answers to Selected Exercises. D. Greek Letters and Mathematical Symbols Used in the Text. Author Index. Subject Index.

10,011 citations
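As a small taste of the stationary ARMA machinery the book develops: an AR(1) process y_t = ρ·y_{t-1} + ε_t has autocorrelation ρ^k at lag k, which a short simulation confirms (illustrative sketch; the function name is invented):

```python
import numpy as np

def ar1_autocorr(rho, k, T=100_000, seed=0):
    """Sample lag-k autocorrelation of a simulated AR(1) path."""
    rng = np.random.default_rng(seed)
    e = rng.normal(size=T)
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t - 1] + e[t]
    y = y - y.mean()
    return float(y[k:] @ y[:-k] / (y @ y))
```

For rho = 0.9, the lag-1 estimate comes out near 0.9 and the lag-3 estimate near 0.9 cubed, matching the geometric decay the theory predicts.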

Book
01 Jan 2009

8,216 citations

Journal ArticleDOI
TL;DR: This article showed that the current prevalence of internal war is mainly the result of a steady accumulation of protracted conflicts since the 1950s and 1960s rather than a sudden change associated with a new, post-Cold War international system.
Abstract: An influential conventional wisdom holds that civil wars proliferated rapidly with the end of the Cold War and that the root cause of many or most of these has been ethnic and religious antagonisms. We show that the current prevalence of internal war is mainly the result of a steady accumulation of protracted conflicts since the 1950s and 1960s rather than a sudden change associated with a new, post-Cold War international system. We also find that after controlling for per capita income, more ethnically or religiously diverse countries have been no more likely to experience significant civil violence in this period. We argue for understanding civil war in this period in terms of insurgency or rural guerrilla warfare, a particular form of military practice that can be harnessed to diverse political agendas. The factors that explain which countries have been at risk for civil war are not their ethnic or religious characteristics but rather the conditions that favor insurgency. These include poverty—which marks financially and bureaucratically weak states and also favors rebel recruitment—political instability, rough terrain, and large populations. We wish to thank the many people who provided comments on earlier versions of this paper in a series of seminar presentations. The authors also gratefully acknowledge the support of the National Science Foundation (Grants SES-9876477 and SES-9876530); support from the Center for Advanced Study in the Behavioral Sciences with funds from the William and Flora Hewlett Foundation; valuable research assistance from Ebru Erdem, Nikolay Marinov, Quinn Mecham, David Patel, and TQ Shang; sharing of data by Paul Collier.

5,994 citations
