Book

An Introduction to Multivariate Statistical Analysis

14 Sep 1984
TL;DR: This book develops multivariate normal theory, covering estimation of the mean vector and covariance matrix, the generalized T2-statistic, classification of observations, multivariate analysis of variance, tests of independence between sets of variates, principal components, canonical correlations, and factor analysis.
Abstract: Preface to the Third Edition. Preface to the Second Edition. Preface to the First Edition. 1. Introduction. 2. The Multivariate Normal Distribution. 3. Estimation of the Mean Vector and the Covariance Matrix. 4. The Distributions and Uses of Sample Correlation Coefficients. 5. The Generalized T2-Statistic. 6. Classification of Observations. 7. The Distribution of the Sample Covariance Matrix and the Sample Generalized Variance. 8. Testing the General Linear Hypothesis: Multivariate Analysis of Variance. 9. Testing Independence of Sets of Variates. 10. Testing Hypotheses of Equality of Covariance Matrices and Equality of Mean Vectors and Covariance Matrices. 11. Principal Components. 12. Canonical Correlations and Canonical Variables. 13. The Distributions of Characteristic Roots and Vectors. 14. Factor Analysis. 15. Patterns of Dependence; Graphical Models. Appendix A: Matrix Theory. Appendix B: Tables. References. Index.
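
The generalized T2-statistic listed above is Hotelling's T2 for testing a hypothesized mean vector. Below is a minimal numpy/scipy sketch of the statistic and its standard F transformation; the simulated data, sample size, and variable names are illustrative, not taken from the book.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated sample: n observations of a p-variate normal vector (illustrative).
n, p = 50, 3
X = rng.multivariate_normal(mean=[0.2, 0.0, -0.1], cov=np.eye(p), size=n)

mu0 = np.zeros(p)                      # hypothesized mean vector
xbar = X.mean(axis=0)                  # sample mean vector
S = np.cov(X, rowvar=False)            # sample covariance matrix (unbiased)

# Hotelling's generalized T^2: T^2 = n (xbar - mu0)' S^{-1} (xbar - mu0)
diff = xbar - mu0
T2 = n * diff @ np.linalg.solve(S, diff)

# Under H0, (n - p) / (p (n - 1)) * T^2 follows an F(p, n - p) distribution.
F = (n - p) / (p * (n - 1)) * T2
p_value = stats.f.sf(F, p, n - p)
print(f"T^2 = {T2:.3f}, F = {F:.3f}, p = {p_value:.4f}")
```
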
Citations
Journal ArticleDOI
TL;DR: In this paper, the authors consider a nonstationary vector autoregressive process that is integrated of order 1 and generated by i.i.d. Gaussian errors, and derive the maximum likelihood estimator of the space of cointegration vectors and the likelihood ratio test of the hypothesis that it has a given number of dimensions.

16,189 citations

Journal ArticleDOI
TL;DR: In this article, the authors discuss the problem of estimating the sampling distribution of a pre-specified random variable R(X, F) on the basis of the observed data x.
Abstract: We discuss the following problem: given a random sample \(X = (X_1, X_2, \ldots, X_n)\) from an unknown probability distribution F, estimate the sampling distribution of some prespecified random variable R(X, F), on the basis of the observed data x. (Standard jackknife theory gives an approximate mean and variance in the case \(R(X, F) = \theta(\hat F) - \theta(F)\), with θ some parameter of interest.) A general method, called the “bootstrap”, is introduced, and shown to work satisfactorily on a variety of estimation problems. The jackknife is shown to be a linear approximation method for the bootstrap. The exposition proceeds by a series of examples: variance of the sample median, error rates in a linear discriminant analysis, ratio estimation, estimating regression parameters, etc.

14,483 citations
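
A minimal sketch of the bootstrap idea described in the Efron abstract above: resample the observed data with replacement, recompute the statistic of interest (here the sample median, one of the paper's examples), and use the resampled values as an estimate of its sampling distribution. The data and settings below are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Observed sample x (illustrative data) and the statistic of interest.
x = rng.exponential(scale=2.0, size=40)
statistic = np.median

# Bootstrap: draw B resamples of size n with replacement and recompute the statistic.
B = 2000
n = len(x)
boot = np.array([statistic(rng.choice(x, size=n, replace=True)) for _ in range(B)])

# The empirical distribution of the bootstrap replicates approximates the sampling
# distribution of the statistic; its spread estimates the standard error.
print("observed median:", statistic(x))
print("bootstrap SE estimate:", boot.std(ddof=1))
print("95% percentile interval:", np.percentile(boot, [2.5, 97.5]))
```
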

Journal ArticleDOI
TL;DR: In this paper, the estimation and testing of long-run relations in economic modeling are addressed: starting with a vector autoregressive (VAR) model, the hypothesis of cointegration is formulated as a hypothesis of reduced rank of the long-run impact matrix.
Abstract: The estimation and testing of long-run relations in economic modeling are addressed. Starting with a vector autoregressive (VAR) model, the hypothesis of cointegration is formulated as the hypothesis of reduced rank of the long-run impact matrix. This is given in a simple parametric form that allows the application of the method of maximum likelihood and likelihood ratio tests. In this way, one can derive estimates and test statistics for the hypothesis of a given number of cointegration vectors, as well as estimates and tests for linear hypotheses about the cointegration vectors and their weights. The asymptotic inferences concerning the number of cointegrating vectors involve nonstandard distributions. Inference concerning linear restrictions on the cointegration vectors and their weights can be performed using the usual chi squared methods. In the case of linear restrictions on beta, a Wald test procedure is suggested. The proposed methods are illustrated by money demand data from the Danish and Finnish economies.

12,449 citations
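
As a hedged illustration of the Johansen procedure described above, the sketch below simulates two cointegrated random walks and runs statsmodels' coint_johansen helper. The data are synthetic, and the attribute names (lr1 for the trace statistics, cvt for their critical values) are those exposed by statsmodels' JohansenTestResult in recent versions; this is a usage sketch, not the paper's own implementation.

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(2)

# Simulate two I(1) series sharing a common stochastic trend, so they cointegrate.
T = 500
trend = np.cumsum(rng.normal(size=T))          # common random walk
y1 = trend + rng.normal(scale=0.5, size=T)
y2 = 0.8 * trend + rng.normal(scale=0.5, size=T)
Y = np.column_stack([y1, y2])

# Johansen test: det_order=0 includes a constant, k_ar_diff=1 lagged difference.
res = coint_johansen(Y, det_order=0, k_ar_diff=1)

# Trace statistic for H0: cointegration rank <= r, with 90/95/99% critical values in res.cvt.
for r, (stat, cvals) in enumerate(zip(res.lr1, res.cvt)):
    print(f"rank <= {r}: trace = {stat:.2f}, 95% critical value = {cvals[1]:.2f}")
```
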

Journal ArticleDOI
TL;DR: It is suggested that if Guttman's latent-root-one lower bound estimate for the rank of a correlation matrix is accepted as a psychometric upper bound, then the rank for a sample matrix should be estimated by subtracting out the component in the latent roots which can be attributed to sampling error.
Abstract: It is suggested that if Guttman's latent-root-one lower bound estimate for the rank of a correlation matrix is accepted as a psychometric upper bound, following the proofs and arguments of Kaiser and Dickman, then the rank for a sample matrix should be estimated by subtracting out the component in the latent roots which can be attributed to sampling error, and least-squares “capitalization” on this error, in the calculation of the correlations and the roots. A procedure based on the generation of random variables is given for estimating the component which needs to be subtracted.

6,722 citations
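
The random-variable procedure in the abstract above is now usually called parallel analysis: compare the observed correlation-matrix eigenvalues with the mean eigenvalues of same-sized random normal data and retain only components whose roots exceed that random baseline. The numpy sketch below uses illustrative simulated data and my own helper name, not the paper's algorithm verbatim.

```python
import numpy as np

rng = np.random.default_rng(3)

def parallel_analysis(X, n_sims=200):
    """Return observed correlation eigenvalues and the mean eigenvalues of
    random normal data of the same shape, for a Horn-style comparison."""
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    rand = np.zeros(p)
    for _ in range(n_sims):
        Z = rng.normal(size=(n, p))
        rand += np.sort(np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False)))[::-1]
    return obs, rand / n_sims

# Illustrative data: two latent factors driving six observed variables plus noise.
n, p = 300, 6
F = rng.normal(size=(n, 2))
loadings = rng.normal(size=(2, p))
X = F @ loadings + rng.normal(scale=1.0, size=(n, p))

obs, rand_mean = parallel_analysis(X)
print("observed roots:", np.round(obs, 2))
print("random-data mean roots:", np.round(rand_mean, 2))
print("components retained:", int(np.sum(obs > rand_mean)))
```
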

Journal ArticleDOI
TL;DR: In this article, an approximate procedure based on classical analysis of variance is presented, including an adjustment to the degrees of freedom resulting in conservative F tests, which can be applied to the case where the variance-covariance matrices differ from group to group.
Abstract: This paper is concerned with methods for analyzing quantitative, non-categorical profile data, e.g., a battery of tests given to individuals in one or more groups. It is assumed that the variables have a multinormal distribution with an arbitrary variance-covariance matrix. Approximate procedures based on classical analysis of variance are presented, including an adjustment to the degrees of freedom resulting in conservative F tests. These can be applied to the case where the variance-covariance matrices differ from group to group. In addition, exact generalized multivariate analysis methods are discussed. Examples are given illustrating both techniques.

4,638 citations
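
The degrees-of-freedom adjustment described above is usually expressed through a sphericity correction factor epsilon, with the conservative F test corresponding to the lower bound 1/(p-1). The sketch below is a minimal numpy/scipy illustration of the Box/Greenhouse-Geisser epsilon estimate computed from the sample covariance of p repeated measures; the simulated data and variable names are mine, not the paper's.

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(4)

# Illustrative repeated-measures data: n subjects measured under p conditions.
n, p = 30, 4
cov = 0.5 * np.ones((p, p)) + 0.5 * np.eye(p)
X = rng.multivariate_normal(mean=np.zeros(p), cov=cov, size=n)

S = np.cov(X, rowvar=False)                 # sample covariance of the p measures

# Orthonormal contrasts: rows span the space orthogonal to the unit vector.
C = null_space(np.ones((1, p))).T           # shape (p-1, p)
V = C @ S @ C.T

# Box / Greenhouse-Geisser sphericity estimate:
#   epsilon_hat = tr(V)^2 / ((p-1) * tr(V @ V)),  with 1/(p-1) <= epsilon_hat <= 1.
eps_hat = np.trace(V) ** 2 / ((p - 1) * np.trace(V @ V))
eps_lower = 1.0 / (p - 1)                   # bound behind the conservative F test

# Adjusted degrees of freedom for the within-subjects F test.
df1, df2 = (p - 1) * eps_hat, (p - 1) * (n - 1) * eps_hat
print(f"epsilon_hat = {eps_hat:.3f} (conservative lower bound {eps_lower:.3f})")
print(f"adjusted df: ({df1:.2f}, {df2:.2f})")
```
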