Author

T. W. Anderson

Bio: T. W. Anderson is an academic researcher from Stanford University. The author has contributed to research in topics: Estimator & Autoregressive model. The author has an h-index of 52, co-authored 179 publications receiving 42299 citations. Previous affiliations of T. W. Anderson include Columbia University & Carnegie Mellon University.


Papers
Journal Article
TL;DR: In this article, it is shown that for $|\alpha| > 1$ the limiting distribution of the estimator depends on the distribution of the $u$'s, so that central limit theorems are not applicable, while for $|\alpha| < 1$ the limiting normal distribution holds under the assumption that the $u$'s are independently, identically distributed with finite variance, as conjectured by White.
Abstract: Let $x_t(t = 1, 2, \cdots)$ be defined recursively by \begin{equation*}\tag{1.1}x_t = \alpha x_{t-1} + u_t,\quad t = 1, 2, \cdots,\end{equation*} where $x_0$ is a constant, $\varepsilon u_t = 0, \varepsilon u^2_t = \sigma^2$ and $\varepsilon u_tu_s = 0, t \neq s$. ($\varepsilon$ denotes mathematical expectation.) An estimate of $\alpha$ based on $x_1, \cdots, x_T$ (which is the maximum likelihood estimate of $\alpha$ if the $u$'s are normally distributed) is \begin{equation*}\tag{1.2}\hat \alpha = \bigg(\sum^T_{t=1} x_tx_{t-1}\bigg)\bigg/\bigg(\sum^T_{t=1} x^2_{t-1}\bigg).\end{equation*} If $|\alpha| < 1$, then $T^{\frac{1}{2}}(\hat \alpha - \alpha)$ has a limiting normal distribution with mean $0$ and variance $1 - \alpha^2$ when the $u$'s are independently identically distributed with $\sigma^2 > 0$. (See [2], Chapter II, for example.) If $|\alpha| > 1$, White [3] has shown $(\hat \alpha - \alpha)|\alpha|^T/(\alpha^2 - 1)$ has a limiting Cauchy distribution under the assumption that $x_0 = 0$ and the $u$'s are normally distributed; he has also found the distribution when $x_0 \neq 0$. His results can be easily modified and restated in the following form: $(\Sigma^T_{t=1} x^2_{t-1})^{\frac{1}{2}}(\hat \alpha - \alpha)$ has a limiting normal distribution if the $u$'s are normally distributed and if $|\alpha| \neq 1$. Peculiarly, for $|\alpha| = 1$ this statistic has a limiting distribution which is not normal (and is not even symmetric for $x_0 = 0$). One purpose of this paper is to characterize the limiting distributions for $|\alpha| > 1$ when the $u$'s are not necessarily normally distributed; it will be shown that for $|\alpha| > 1$ the results depend on the distribution of the $u$'s. Central limit theorems are not applicable. Secondly, the limiting distribution for $|\alpha| < 1$ will be shown to hold under the assumption that the $u$'s are independently, identically distributed with finite variance. This was conjectured by White.
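
As an illustration of the setup above, here is a minimal simulation sketch, not from the paper, assuming standard normal errors for concreteness: it generates the recursion (1.1), computes the estimator (1.2), and lets the stationary case $|\alpha| < 1$ and the explosive case $|\alpha| > 1$ be compared numerically.

```python
# Minimal sketch: simulate x_t = alpha * x_{t-1} + u_t and compute
# alpha_hat from (1.2). Gaussian errors are an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)

def ar1_estimate(alpha, T, x0=0.0):
    """Simulate x_1, ..., x_T from (1.1) and return alpha_hat from (1.2)."""
    u = rng.standard_normal(T)            # i.i.d. errors, mean 0, variance 1
    x = np.empty(T + 1)
    x[0] = x0
    for t in range(1, T + 1):
        x[t] = alpha * x[t - 1] + u[t - 1]
    # alpha_hat = (sum_t x_t x_{t-1}) / (sum_t x_{t-1}^2)
    return np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])

# Stationary case: sqrt(T) * (alpha_hat - alpha) is approximately N(0, 1 - alpha^2).
print(ar1_estimate(alpha=0.5, T=10_000))   # close to 0.5
# Explosive case: the limiting law depends on the distribution of the u's.
print(ar1_estimate(alpha=1.2, T=200))      # close to 1.2
```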

376 citations

Journal Article
TL;DR: In this article, the authors evaluated the effect of exposure to biologically equalized doses of UV radiation on in vivo immunization in humans and concluded that relevant and even subclinical levels of UV exposure have significant down-modulatory effects on the ability of humans to generate a T-cell-mediated response to antigens introduced through irradiated skin.
Abstract: Increasing UVB radiation at the earth's surface might have adverse effects on in vivo immunologic responses in humans. We prospectively randomized subjects to test whether epicutaneous immunization is altered by prior administration of biologically equalized doses of UV radiation. Multiple doses of antigens on upper inner arm skin (UV protected) were used to elicit contact sensitivity responses, which were quantitated by measuring increases in skin thickness. If a dose of UVB sufficient to induce redness (erythemagenic) was administered to the immunization site prior to sensitization with dinitrochlorobenzene (DNCB), we noted a marked reduction in the degree of sensitization (P < 0.0006) that was highly dose responsive (r = 0.98). Even suberythemagenic UV (less than a visible sunburn) resulted in a decreased frequency of strongly positive responses (32%) as compared to controls (73%) (P = 0.019). The rate of immunologic tolerance to DNCB (active suppression of a subsequent repeat immunization) in the groups that were initially sensitized on skin receiving erythemagenic doses of UV was 31% (P = 0.0003). In addition, a localized moderate sunburn appeared to modulate immunization with diphenylcyclopropenone through a distant, unirradiated site (41% weak responses) as compared to the control group (9%) (P = 0.05). Monitoring antigen presenting cell content in the epidermis revealed that erythemagenic regimens induced CD1a-DR+ macrophages and depleted Langerhans cells. In conclusion, relevant and even subclinical levels of UV exposure have significant down-modulatory effects on the ability of humans to generate a T-cell-mediated response to antigens introduced through irradiated skin.

366 citations

Journal Article
TL;DR: In this paper, a general approach to estimating linear statistical relationships is presented in three lectures on linear functional and structural relationships, factor analysis, and simultaneous equations models, focusing on the similarity of maximum likelihood estimators under normality in the different models.
Abstract: This paper on estimating linear statistical relationships includes three lectures on linear functional and structural relationships, factor analysis, and simultaneous equations models. The emphasis is on relating the several models by a general approach and on the similarity of maximum likelihood estimators (under normality) in the different models. In the first two lectures the observable vector is decomposed into a "systematic part" and a random error; the systematic part satisfies the linear relationships. Estimators are derived for several cases and some of their properties given. Estimation of the coefficients of a single equation in a simultaneous equations model is shown to be equivalent to estimation of linear functional relationships.
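
For the simplest two-variable case touched on above, the maximum likelihood estimate of a linear functional relationship, under independent normal errors of equal variance in both coordinates, is the orthogonal (total least squares) line. The sketch below is illustrative only, not code from the lectures, and the data-generating values are hypothetical.

```python
# Hedged sketch: orthogonal regression via the smallest principal
# component of the centered data (ML under equal normal error variances).
import numpy as np

def orthogonal_regression(x, y):
    """Fit y = a + b*x by minimizing perpendicular distances."""
    data = np.column_stack([x, y]) - [x.mean(), y.mean()]
    # The right singular vector for the smallest singular value is the
    # normal direction of the fitted line.
    _, _, vt = np.linalg.svd(data, full_matrices=False)
    nx, ny = vt[-1]
    b = -nx / ny                     # slope of the line
    a = y.mean() - b * x.mean()      # intercept
    return a, b

rng = np.random.default_rng(1)
xi = np.linspace(0, 10, 200)                               # systematic part
x = xi + rng.normal(scale=0.5, size=xi.size)               # error in x
y = 2.0 + 3.0 * xi + rng.normal(scale=0.5, size=xi.size)   # error in y
print(orthogonal_regression(x, y))                         # roughly (2.0, 3.0)
```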

272 citations

Journal Article
TL;DR: In this article, the consistency of the estimates and the asymptotic distributions of the estimates and the test criteria are studied under conditions more general than those used in the derivation of these estimates and criteria.
Abstract: In a previous paper [2] the authors have given a method for estimating the coefficients of a single equation in a complete system of linear stochastic equations. In the present paper the consistency of the estimates and the asymptotic distributions of the estimates and the test criteria are studied under conditions more general than those used in the derivation of these estimates and criteria. The point estimates, which can be obtained as maximum likelihood estimates under certain assumptions including that of normality of disturbances, are consistent even if the disturbances are not normally distributed and (a) some predetermined variables are neglected (Theorem 1) or (b) the single equation is in a non-linear system with certain properties (Theorem 2). Under certain general conditions (normality of the disturbances not being required) the estimates are asymptotically normally distributed (Theorems 3 and 4). The asymptotic covariance matrix is given for several cases. The criteria derived in [2] for testing the hypothesis of over-identification have, asymptotically, $\chi^2$-distributions (Theorem 5). The exact confidence regions developed in [2] for the case that all predetermined variables are exogenous (that is, that the difference equations are of zero order) are shown to be consistent and to hold asymptotically even when this assumption is not true (Theorem 6).

263 citations

Journal Article
TL;DR: In this article, it is shown that a particular centering of the maximum likelihood estimator derived under assumed normality of observations yields an asymptotic normal distribution that is common to a wide class of distributions of the factor vectors and error vectors.
Abstract: Asymptotic properties of estimators for the confirmatory factor analysis model are discussed. The model is identified by restrictions on the elements of the factor loading matrix; the number of restrictions may exceed that required for identification. It is shown that a particular centering of the maximum likelihood estimator derived under assumed normality of observations yields an asymptotic normal distribution that is common to a wide class of distributions of the factor vectors and error vectors. In particular, the asymptotic covariance matrix of the factor loading estimator derived under the normal assumption is shown to be valid for the factor vectors containing a fixed part and a random part with any distribution having finite second moments and for the error vectors consisting of independent components with any distributions having finite second moments. Thus the asymptotic standard errors of the factor loading estimators computed by standard computer packages are valid for virtually any type of nonnormal factor analysis. The results are extended to certain structural equation models.

221 citations


Cited by
Journal Article
TL;DR: In this article, the adequacy of the conventional cutoff criteria and several new alternatives for various fit indexes used to evaluate model fit in practice were examined; the results suggest that, for the ML method, a cutoff value close to .95 is needed for TLI, BL89, CFI, RNI, and Gamma Hat before relatively good model fit can be concluded.
Abstract: This article examines the adequacy of the “rules of thumb” conventional cutoff criteria and several new alternatives for various fit indexes used to evaluate model fit in practice. Using a 2‐index presentation strategy, which includes using the maximum likelihood (ML)‐based standardized root mean squared residual (SRMR) and supplementing it with either Tucker‐Lewis Index (TLI), Bollen's (1989) Fit Index (BL89), Relative Noncentrality Index (RNI), Comparative Fit Index (CFI), Gamma Hat, McDonald's Centrality Index (Mc), or root mean squared error of approximation (RMSEA), various combinations of cutoff values from selected ranges of cutoff criteria for the ML‐based SRMR and a given supplemental fit index were used to calculate rejection rates for various types of true‐population and misspecified models; that is, models with misspecified factor covariance(s) and models with misspecified factor loading(s). The results suggest that, for the ML method, a cutoff value close to .95 for TLI, BL89, CFI, RNI, and Gamma Hat, a cutoff value close to .90 for Mc, a cutoff value close to .08 for SRMR, and a cutoff value close to .06 for RMSEA are needed before one can conclude that there is a relatively good fit between the hypothesized model and the observed data.
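
For concreteness, the sketch below computes two of the indexes named above, SRMR and RMSEA, from their standard textbook formulas. It is not code from the article; `S` and `Sigma` are hypothetical sample and model-implied covariance matrices.

```python
# Hedged sketch of two fit indexes; the formulas are the standard ones,
# the inputs (S, Sigma, chi2, df, n) are hypothetical.
import numpy as np

def srmr(S, Sigma):
    """Standardized root mean squared residual."""
    d = np.sqrt(np.diag(S))
    resid = (S - Sigma) / np.outer(d, d)   # standardized covariance residuals
    iu = np.triu_indices_from(resid)       # unique elements, i <= j
    return np.sqrt(np.mean(resid[iu] ** 2))

def rmsea(chi2, df, n):
    """Root mean squared error of approximation."""
    return np.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# In the spirit of the 2-index strategy, one might flag acceptable fit
# when SRMR <= .08 and RMSEA <= .06.
S = np.array([[1.0, 0.42], [0.42, 1.0]])
Sigma = np.array([[1.0, 0.40], [0.40, 1.0]])
print(srmr(S, Sigma), rmsea(chi2=12.3, df=8, n=500))
```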

76,383 citations

Journal Article
TL;DR: In this article, a new estimate, the minimum information theoretical criterion (AIC) estimate (MAICE), is introduced for the purpose of statistical identification; it is free from the ambiguities inherent in the application of the conventional hypothesis testing procedure.
Abstract: The history of the development of statistical hypothesis testing in time series analysis is reviewed briefly, and it is pointed out that the hypothesis testing procedure is not adequately defined as a procedure for statistical model identification. The classical maximum likelihood estimation procedure is reviewed, and a new estimate, the minimum information theoretical criterion (AIC) estimate (MAICE), designed for the purpose of statistical identification, is introduced. When there are several competing models, the MAICE is defined by the model and the maximum likelihood estimates of the parameters which give the minimum of AIC, defined by AIC = (−2) log(maximum likelihood) + 2 (number of independently adjusted parameters within the model). MAICE provides a versatile procedure for statistical model identification which is free from the ambiguities inherent in the application of the conventional hypothesis testing procedure. The practical utility of MAICE in time series analysis is demonstrated with some numerical examples.
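
The AIC formula above translates directly into a model-selection loop. Below is a minimal sketch of the MAICE idea for choosing the order of a Gaussian AR(p) model fitted by least squares; the fitting details are illustrative assumptions, not Akaike's procedure verbatim.

```python
# MAICE sketch: among candidate AR(p) models, pick the one minimizing
# AIC = -2 log(maximum likelihood) + 2 k, with k the number of
# independently adjusted parameters.
import numpy as np

def aic_ar(x, p):
    """Gaussian AIC for an AR(p) model with intercept, fitted by least squares."""
    n = len(x)
    y = x[p:]
    X = np.column_stack([np.ones(n - p)] + [x[p - j : n - j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.mean((y - X @ beta) ** 2)              # ML variance estimate
    loglik = -0.5 * (n - p) * (np.log(2 * np.pi * sigma2) + 1)
    k = p + 2                                          # AR coefficients + intercept + variance
    return -2 * loglik + 2 * k

rng = np.random.default_rng(2)
x = np.zeros(500)
for t in range(2, 500):                                # true model: AR(2)
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()

aics = {p: aic_ar(x, p) for p in range(1, 6)}
print(min(aics, key=aics.get))                         # MAICE order, typically 2
```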

47,133 citations

Book
01 Jan 2001
TL;DR: The second edition of Jeffrey Wooldridge's widely used graduate text Econometric Analysis of Cross Section and Panel Data (MIT Press) provides a unified treatment of cross section and panel data methods for microeconomic research.
Abstract: The second edition of this acclaimed graduate text provides a unified treatment of two methods used in contemporary econometric research, cross section and panel data methods. By focusing on assumptions that can be given behavioral content, the book maintains an appropriate level of rigor while emphasizing intuitive thinking. The analysis covers both linear and nonlinear models, including models with dynamics and/or individual heterogeneity. In addition to general estimation frameworks (in particular, method of moments and maximum likelihood), specific linear and nonlinear methods are covered in detail, including probit and logit models and their multivariate extensions, Tobit models, models for count data, censored and missing data schemes, causal (or treatment) effects, and duration analysis. Econometric Analysis of Cross Section and Panel Data was the first graduate econometrics text to focus on microeconomic data structures, allowing assumptions to be separated into population and sampling assumptions. This second edition has been substantially updated and revised. Improvements include a broader class of models for missing data problems; more detailed treatment of cluster problems, an important topic for empirical researchers; expanded discussion of "generalized instrumental variables" (GIV) estimation; new coverage (based on the author's own recent research) of inverse probability weighting; a more complete framework for estimating treatment effects with panel data; and a firmly established link between econometric approaches to nonlinear panel data and the "generalized estimating equation" literature popular in statistics and other fields. New attention is given to explaining when particular econometric methods can be applied; the goal is not only to tell readers what does work, but why certain "obvious" procedures do not. The numerous included exercises, both theoretical and computer-based, allow the reader to extend methods covered in the text and discover new insights.

28,298 citations

Journal Article
TL;DR: In this article, the generalized method of moments (GMM) estimator optimally exploits all the linear moment restrictions that follow from the assumption of no serial correlation in the errors, in an equation which contains individual effects, lagged dependent variables and no strictly exogenous variables.
Abstract: This paper presents specification tests that are applicable after estimating a dynamic model from panel data by the generalized method of moments (GMM), and studies the practical performance of these procedures using both generated and real data. Our GMM estimator optimally exploits all the linear moment restrictions that follow from the assumption of no serial correlation in the errors, in an equation which contains individual effects, lagged dependent variables and no strictly exogenous variables. We propose a test of serial correlation based on the GMM residuals and compare this with Sargan tests of over-identifying restrictions and Hausman specification tests.
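
A minimal sketch of the simplest moment condition behind this estimator may help. After first differencing removes the individual effects, the level $y_{i,t-2}$ is uncorrelated with the differenced error and can instrument the lagged difference; this is the Anderson-Hsiao-type instrumental-variables estimator that the GMM estimator generalizes by stacking all available lags. The panel dimensions and parameter values below are hypothetical.

```python
# Hedged sketch: one moment condition of difference-GMM on a balanced
# AR(1) panel with individual effects. Not the paper's full estimator.
import numpy as np

rng = np.random.default_rng(3)
N, T, alpha = 500, 8, 0.6

eta = rng.standard_normal(N)                 # individual effects
y = np.zeros((N, T))
for t in range(1, T):
    y[:, t] = alpha * y[:, t - 1] + eta + rng.standard_normal(N)

dy = np.diff(y, axis=1)                      # dy[:, t] = y[:, t+1] - y[:, t]
# E[y_{i,t-2} * (v_{it} - v_{i,t-1})] = 0 yields the IV estimator:
num = np.sum(y[:, :-2] * dy[:, 1:])          # instrument times dy_t
den = np.sum(y[:, :-2] * dy[:, :-1])         # instrument times dy_{t-1}
print(num / den)                             # approximately alpha = 0.6
```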

26,580 citations

Book
B. J. Winer
01 Jan 1962
TL;DR: In this book, the author introduces the principles of estimation and inference (means and variances) and the design and analysis of single-factor and factorial experiments, including experiments having repeated measures on the same element.
Abstract: CHAPTER 1: Introduction to Design CHAPTER 2: Principles of Estimation and Inference: Means and Variance CHAPTER 3: Design and Analysis of Single-Factor Experiments: Completely Randomized Design CHAPTER 4: Single-Factor Experiments Having Repeated Measures on the Same Element CHAPTER 5: Design and Analysis of Factorial Experiments: Completely-Randomized Design CHAPTER 6: Factorial Experiments: Computational Procedures and Numerical Example CHAPTER 7: Multifactor Experiments Having Repeated Measures on the Same Element CHAPTER 8: Factorial Experiments in which Some of the Interactions are Confounded CHAPTER 9: Latin Squares and Related Designs CHAPTER 10: Analysis of Covariance
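
As a small worked example of the Chapter 3 material, not taken from the book, the following computes the one-way ANOVA F test for a completely randomized single-factor design by hand and checks it against SciPy; the group means are hypothetical.

```python
# Illustrative one-way ANOVA for a completely randomized design.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
groups = [rng.normal(loc=m, scale=1.0, size=20) for m in (0.0, 0.0, 0.8)]

grand = np.mean(np.concatenate(groups))                       # grand mean
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
df_b = len(groups) - 1
df_w = sum(len(g) for g in groups) - len(groups)
F = (ss_between / df_b) / (ss_within / df_w)
p = stats.f.sf(F, df_b, df_w)

print(F, p)
print(stats.f_oneway(*groups))   # same F statistic and p value
```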

25,607 citations