Journal ArticleDOI

Linear Statistical Inference and Its Applications.

01 Sep 1975-Biometrics-Vol. 31, Iss: 3, pp 791
About: This article was published in Biometrics on 1975-09-01 and has received 4122 citations to date. The article focuses on the topic: Statistical inference.
Citations
Book
01 Jan 1983
TL;DR: In this paper, a generalization of the analysis of variance is given for these models using log-likelihoods, illustrated by examples relating to four distributions: the Normal, Binomial (probit analysis, etc.), Poisson (contingency tables), and gamma (variance components).
Abstract: The technique of iterative weighted linear regression can be used to obtain maximum likelihood estimates of the parameters with observations distributed according to some exponential family and systematic effects that can be made linear by a suitable transformation. A generalization of the analysis of variance is given for these models using log-likelihoods. These generalized linear models are illustrated by examples relating to four distributions: the Normal, Binomial (probit analysis, etc.), Poisson (contingency tables), and gamma (variance components).
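The iterative weighted linear regression technique the abstract describes can be sketched in a few lines. Below is a minimal numpy illustration for one of the four cases (a Poisson GLM with log link) on simulated data; the function name and data are my own, not code from the paper:

```python
import numpy as np

def irls_poisson(X, y, n_iter=25):
    """Fit a Poisson GLM with log link by iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean() + 1e-8)   # crude but stable starting value
    for _ in range(n_iter):
        eta = X @ beta                  # linear predictor
        mu = np.exp(eta)                # inverse link
        z = eta + (y - mu) / mu         # working response
        W = mu                          # working weights (Poisson, log link)
        XtW = X.T * W
        beta = np.linalg.solve(XtW @ X, XtW @ z)  # weighted least squares step
    return beta

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = rng.poisson(np.exp(X @ np.array([0.5, 1.0])))  # true coefficients (0.5, 1.0)
beta_hat = irls_poisson(X, y)
```

With 200 observations the fitted coefficients land close to the true (0.5, 1.0); each IRLS step is just a weighted linear regression, which is the point of the paper's formulation.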

23,215 citations

Journal ArticleDOI
TL;DR: In this paper, the authors consider a nonstationary vector autoregressive process which is integrated of order 1, and generated by i.i.d. Gaussian errors, and derive the maximum likelihood estimator of the space of cointegration vectors and the likelihood ratio test of the hypothesis that it has a given number of dimensions.
Abstract: We consider a nonstationary vector autoregressive process which is integrated of order 1, and generated by i.i.d. Gaussian errors. We then derive the maximum likelihood estimator of the space of cointegration vectors and the likelihood ratio test of the hypothesis that it has a given number of dimensions. Further we test linear hypotheses about the cointegration vectors. The asymptotic distributions of these test statistics are found; the first is described by a natural multivariate version of the usual test for a unit root in an autoregressive process, and the other is a χ² test.

1. Introduction. The idea of using cointegration vectors in the study of nonstationary time series comes from the work of Granger (1981), Granger and Weiss (1983), Granger and Engle (1985), and Engle and Granger (1987). The connection with error correcting models has been investigated by a number of authors; see Davidson (1986), Stock (1987), and Johansen (1988) among others. Granger and Engle (1987) suggest estimating the cointegration relations using regression, and these estimators have been investigated by Stock (1987), Phillips (1985), Phillips and Durlauf (1986), Phillips and Park (1986a, b, 1987), Phillips and Ouliaris (1986, 1987), Stock and Watson (1987), and Sims, Stock and Watson (1986). The purpose of this paper is to derive maximum likelihood estimators of the cointegration vectors for an autoregressive process with independent Gaussian errors, and to derive a likelihood ratio test for the hypothesis that there is a given number of these. A similar approach has been taken by Ahn and Reinsel (1987). This program will not only give good estimates and test statistics in the Gaussian case, but will also yield estimators and tests, the properties of which can be investigated under various other assumptions about the underlying data generating process. The reason for expecting the estimators to behave better [...]

*The simulations were carefully performed by Marc Andersen with the support of the Danish Social Science Research Council. The author is very grateful to the referee whose critique of the first version greatly helped improve the presentation.
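The regression approach to estimating a cointegration relation, which the abstract attributes to Granger and Engle, can be illustrated with a small simulation. This is a hedged numpy sketch of that first-step regression, not the paper's maximum likelihood procedure: two I(1) series share one stochastic trend, so a regression of one on the other leaves a stationary residual.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500
trend = np.cumsum(rng.normal(size=T))           # shared I(1) stochastic trend
y1 = trend + rng.normal(scale=0.5, size=T)      # each series is nonstationary,
y2 = 2.0 * trend + rng.normal(scale=0.5, size=T)  # but (y2 - 2*y1) is not

# Engle-Granger-style first step: regress one series on the other;
# for a cointegrated pair the residual is stationary even though y1, y2 are I(1).
slope, intercept = np.polyfit(y1, y2, 1)
resid = y2 - (slope * y1 + intercept)
```

The fitted slope recovers the cointegrating coefficient (2.0 here), and the residual variance is tiny relative to the variance of the trending series, the signature of cointegration.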

16,189 citations

Journal ArticleDOI
TL;DR: For comments on an earlier draft of this chapter and for detailed advice I am indebted to Robert M. Hauser, Halliman H. Winsborough, Toni Richards, several anonymous reviewers, and the editor of this volume as discussed by the authors.
Abstract: For comments on an earlier draft of this chapter and for detailed advice I am indebted to Robert M. Hauser, Halliman H. Winsborough, and Toni Richards, several anonymous reviewers, and the editor of this volume. I also wish to thank John Raisian, Nancy Rytina, and Barbara Mann for their comments and Mark Wilson for able research assistance. The opinions expressed here are the sole responsibility of the author.

11,160 citations

Journal ArticleDOI
TL;DR: An efficient algorithm is proposed, which allows the computation of the ICA of a data matrix within a polynomial time and may actually be seen as an extension of the principal component analysis (PCA).
Abstract: The independent component analysis (ICA) of a random vector consists of searching for a linear transformation that minimizes the statistical dependence between its components. In order to define suitable search criteria, the expansion of mutual information is utilized as a function of cumulants of increasing orders. An efficient algorithm is proposed, which allows the computation of the ICA of a data matrix within a polynomial time. The concept of ICA may actually be seen as an extension of the principal component analysis (PCA), which can only impose independence up to the second order and, consequently, defines directions that are orthogonal. Potential applications of ICA include data analysis and compression, Bayesian detection, localization of sources, and blind identification and deconvolution.
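The two stages the abstract describes, whitening (the PCA step, which removes dependence up to second order) followed by an optimization of a fourth-order cumulant contrast, can be illustrated on a two-source toy problem. This grid-search sketch is for intuition only and is not the paper's polynomial-time algorithm; all names are mine.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
S = rng.uniform(-1, 1, size=(2, n))      # two independent non-Gaussian sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])   # unknown mixing matrix
X = A @ S                                # observed mixtures

# Stage 1: whitening (PCA) - decorrelate and rescale to unit variance.
Xc = X - X.mean(axis=1, keepdims=True)
evals, evecs = np.linalg.eigh(Xc @ Xc.T / n)
Z = np.diag(evals ** -0.5) @ evecs.T @ Xc

# Stage 2: among rotations of the whitened data (which all stay white),
# pick the one optimizing a fourth-order (kurtosis) contrast.
def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def contrast(theta):
    Y = rotation(theta) @ Z
    return sum((np.mean(y ** 4) - 3.0) ** 2 for y in Y)  # sum of squared excess kurtoses

best_theta = max(np.linspace(0.0, np.pi / 2, 180, endpoint=False), key=contrast)
Y = rotation(best_theta) @ Z             # recovered sources (up to order/sign/scale)
```

The recovered components correlate almost perfectly with the true sources, while PCA alone would only have delivered orthogonal, still-dependent directions, which is exactly the distinction the abstract draws.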

8,522 citations

Journal ArticleDOI
TL;DR: The bootstrap is extended to other measures of statistical accuracy such as bias and prediction error, and to complicated data structures such as time series, censored data, and regression models.
Abstract: This is a review of bootstrap methods, concentrating on basic ideas and applications rather than theoretical considerations. It begins with an exposition of the bootstrap estimate of standard error for one-sample situations. Several examples, some involving quite complicated statistical procedures, are given. The bootstrap is then extended to other measures of statistical accuracy such as bias and prediction error, and to complicated data structures such as time series, censored data, and regression models. Several more examples are presented illustrating these ideas. The last third of the paper deals mainly with bootstrap confidence intervals.
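The bootstrap estimate of standard error for the one-sample situation mentioned in the abstract fits in a few lines; the following is an illustrative numpy sketch (data and names are mine), with the textbook formula alongside for comparison.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=5.0, scale=2.0, size=100)   # one observed sample

# Bootstrap standard error of the sample mean: resample with replacement,
# recompute the statistic each time, and take the spread of the replicates.
B = 2000
boot_means = np.array([rng.choice(x, size=x.size, replace=True).mean()
                       for _ in range(B)])
se_boot = boot_means.std(ddof=1)

se_formula = x.std(ddof=1) / np.sqrt(x.size)   # classical formula, for reference
```

For the mean the two estimates nearly coincide; the bootstrap's value is that the same resampling loop works unchanged for statistics with no closed-form standard error.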

5,894 citations


Cites background or methods from "Linear Statistical Inference and It..."

  • ...7) is usually obtained by differentiating the log likelihood function, see Section 5a of Rao (1973), although in the context of this paper we might prefer to use the parametric bootstrap estimate of a, e....

    [...]

  • ...By now it should be clear that we can use any random variable R(y, F) to measure accuracy, not just (4.1) or (4.6), and then estimate E_F{R(y, F)} by its bootstrap value E_F̂{R(y*, F̂)} ≈ Σ_{b=1}^{B} R(y*(b), F̂)/B. Similarly we can estimate E_F R(y, F)² by E_F̂ R(y*, F̂)², etc. Efron (1983) considers the prediction problem, in which a training set of data is used to construct a prediction rule....

    [...]

  • ...R. Tibshirani is a Postdoctoral Fellow in the Department of Preventive Medicine and Biostatistics, Faculty of Medicine, University of Toronto, McMurrick Building, Toronto, Ontario, M5S 1A8, Canada. particularly Efron (1982a). Some of the discussion here is abridged from Efron and Gong (1983) and also from Efron (1984)....

    [...]
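The general recipe quoted above, estimating E_F{R(y, F)} by the average of R over bootstrap resamples, can be illustrated with R taken to be the bias of the plug-in (divisor-n) variance estimator. This is a hedged numpy sketch of that recipe, not code from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=50)

def plug_in_var(sample):
    return np.mean(sample ** 2) - np.mean(sample) ** 2  # divisor-n variance (biased)

theta_hat = plug_in_var(x)

# Bootstrap estimate of E_F{R}, with R(y, F) = theta_hat(y) - theta(F):
# average the statistic over resamples and compare to its value on the data.
B = 4000
boot = np.array([plug_in_var(rng.choice(x, size=x.size, replace=True))
                 for _ in range(B)])
bias_boot = boot.mean() - theta_hat
```

The bootstrap bias estimate comes out negative and close to -theta_hat/n, matching the known downward bias of the divisor-n variance.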