Open Access Journal Article (DOI)

Estimating covariance and precision matrices along subspaces

Željko Kereta, +1 more
01 Jan 2021
Vol. 15, Iss. 1, pp. 554-588
TLDR
The results show that the estimation accuracy depends almost exclusively on the components of the distribution that correspond to the desired subspaces or directions. This is relevant for problems where the behavior of data along a lower-dimensional space is of specific interest, such as dimension reduction or structured regression.
Abstract
We study the accuracy of estimating the covariance and the precision matrix of a $D$-variate sub-Gaussian distribution along a prescribed subspace or direction using the finite sample covariance. Our results show that the estimation accuracy depends almost exclusively on the components of the distribution that correspond to the desired subspaces or directions. This is relevant and important for problems where the behavior of data along a lower-dimensional space is of specific interest, such as dimension reduction or structured regression problems. We also show that estimation of precision matrices is almost independent of the condition number of the covariance matrix. The presented applications include direction-sensitive eigenspace perturbation bounds, relative bounds for the smallest eigenvalue, and the estimation of the single-index model. For the latter, a new estimator derived from the analysis is proposed, with strong theoretical guarantees and superior numerical performance.
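To make the quantities in the abstract concrete, here is a minimal numerical sketch (not the authors' estimator or bounds): it draws Gaussian samples from an ill-conditioned covariance, forms the finite sample covariance and its inverse, and compares the full-matrix estimation error with the error restricted to a prescribed subspace, i.e. quantities of the form $\|V^\top(\widehat{\Sigma} - \Sigma)V\|$. The dimensions, eigenvalue profile, and choice of subspace are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D, N = 50, 200

# Ill-conditioned covariance: strong variance along the first two
# eigendirections, weak variance everywhere else.
eigvals = np.concatenate(([10.0, 5.0], np.full(D - 2, 0.1)))
Q, _ = np.linalg.qr(rng.standard_normal((D, D)))
Sigma = Q @ np.diag(eigvals) @ Q.T
Prec = Q @ np.diag(1.0 / eigvals) @ Q.T

# Prescribed two-dimensional subspace: span of the two leading eigenvectors.
V = Q[:, :2]

X = rng.multivariate_normal(np.zeros(D), Sigma, size=N)
Sigma_hat = X.T @ X / N               # finite sample covariance (mean known to be 0)
Prec_hat = np.linalg.inv(Sigma_hat)   # requires N > D

# Operator-norm error of the full matrices vs. error along the subspace V.
err = lambda M: np.linalg.norm(M, 2)
print("covariance:", err(Sigma_hat - Sigma), err(V.T @ (Sigma_hat - Sigma) @ V))
print("precision: ", err(Prec_hat - Prec), err(V.T @ (Prec_hat - Prec) @ V))
```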



Citations
Journal Article (DOI)

Estimating multi-index models with response-conditional least squares

TL;DR: A method is introduced for estimating the index space, and a tight concentration bound is proved that shows $N^{-1/2}$-convergence and also faithfully describes the dependence on the chosen partition of level sets, thereby giving guidance for hyperparameter tuning.
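For orientation, a minimal sketch of a response-conditional least-squares style estimator follows, under the assumption that the method partitions the response range into level sets, fits an ordinary least-squares slope within each level set, and recovers the index space from the principal directions of the collected slopes; the function name, slice count, and skipping rule are illustrative choices, not the paper's exact procedure.

```python
import numpy as np

def rcls_index_space(X, y, d, n_slices=10):
    """Sketch: slice the responses into level sets, compute a local OLS slope
    per slice, and return the top-d principal directions of the slopes."""
    edges = np.quantile(y, np.linspace(0.0, 1.0, n_slices + 1))
    slopes = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (y >= lo) & (y <= hi)
        if mask.sum() <= X.shape[1]:
            continue  # too few samples in this level set for a stable OLS fit
        Xc = X[mask] - X[mask].mean(axis=0)
        yc = y[mask] - y[mask].mean()
        b, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
        slopes.append(b)
    B = np.stack(slopes)                      # rows approximately lie in the index space
    _, _, Vt = np.linalg.svd(B, full_matrices=False)
    return Vt[:d].T                           # D x d orthonormal basis estimate
```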
Journal Article (DOI)

Nonlinear generalization of the monotone single index model

TL;DR: A nonlinear generalization of the single-index model is proposed that allows the regressor to use multiple index vectors, adapting to local changes in the responses, together with theoretical guarantees for estimation of the local index vectors and for out-of-sample prediction.
Journal Article (DOI)

Learning to Bound: A Generative Cramér-Rao Bound

TL;DR: A novel data-driven approach to approximating the Cramér-Rao bound (CRB) is introduced, removing the requirement for an analytical statistical model; it builds on the recent success of deep generative models in modeling complex, high-dimensional distributions.
Journal Article (DOI)

On confidence intervals for precision matrices and the eigendecomposition of covariance matrices

TL;DR: A new statistical test is demonstrated that allows testing for non-zero values of the precision matrix; it makes use of the theory of U-statistics to bound the $L_2$ perturbation of the empirical covariance matrix.

An extended latent factor framework for ill-posed linear regression

TL;DR: The classical latent factor model for linear regression is extended by assuming that, up to an unknown orthogonal transformation, the features consist of subsets that are relevant and irrelevant for the response, and a joint low-dimensionality is imposed only on the relevant feature vector and the response variable.
References
Journal Article (DOI)

Sparse inverse covariance estimation with the graphical lasso

TL;DR: Using a coordinate descent procedure for the lasso, a simple algorithm is developed that solves a 1000-node problem in at most a minute and is 30-4000 times faster than competing methods.
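As a usage illustration (the data and regularization strength here are placeholders), the graphical lasso is available in scikit-learn; the parameter alpha controls the sparsity of the estimated precision matrix.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))   # placeholder data; any sample matrix works

# l1-penalized maximum-likelihood estimate of the inverse covariance.
model = GraphicalLasso(alpha=0.1).fit(X)
precision = model.precision_         # sparse precision matrix estimate
edges = np.abs(precision) > 1e-8     # nonzero entries define the graphical model
```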
Journal Article (DOI)

High-dimensional graphs and variable selection with the Lasso

TL;DR: It is shown that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs; estimating the neighborhood of each node is equivalent to variable selection in a Gaussian linear model.
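A minimal sketch of the neighborhood-selection idea (alpha, the threshold, and the symmetrization rule are illustrative choices): each variable is Lasso-regressed on all others, and the nonzero coefficients mark its neighbors in the estimated graph.

```python
import numpy as np
from sklearn.linear_model import Lasso

def neighborhood_selection(X, alpha=0.1, rule="or"):
    """Estimate graph edges by Lasso-regressing each variable on the rest."""
    d = X.shape[1]
    A = np.zeros((d, d), dtype=bool)
    for j in range(d):
        others = [k for k in range(d) if k != j]
        coef = Lasso(alpha=alpha).fit(X[:, others], X[:, j]).coef_
        A[j, others] = np.abs(coef) > 1e-8
    # "or": keep an edge if either nodewise regression selects it; "and": both.
    return A | A.T if rule == "or" else A & A.T
```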

Introduction To Multivariate Statistical Analysis

Anja Vogler
Journal Article (DOI)

Estimation of the Mean of a Multivariate Normal Distribution

Charles Stein
01 Nov 1981
TL;DR: In this article, an unbiased estimate of the risk is obtained for an arbitrary estimator of the mean, and certain special classes of estimators are then discussed, such as smoothing by moving averages and trimmed analogs of the James-Stein estimator.
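For reference, the unbiased risk estimate at the heart of that paper can be stated as follows (standard form for identity covariance, assuming $g$ is weakly differentiable):

```latex
% For X \sim \mathcal{N}_D(\mu, I_D) and an estimator \hat\mu(X) = X + g(X),
% Stein's identity yields an unbiased estimate of the risk:
\mathbb{E}\,\|\hat\mu(X) - \mu\|^2
  = \mathbb{E}\!\left[ D + 2\,\nabla\!\cdot g(X) + \|g(X)\|^2 \right],
\qquad
\nabla\!\cdot g(x) = \sum_{i=1}^{D} \frac{\partial g_i}{\partial x_i}(x).
```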
Book Chapter (DOI)

Introduction to the non-asymptotic analysis of random matrices.

TL;DR: This is a tutorial on some basic non-asymptotic methods and concepts in random matrix theory, particularly for the problem of estimating covariance matrices in statistics and for validating probabilistic constructions of measurement matrices in compressed sensing.
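A representative covariance bound from this non-asymptotic theory, with the sub-Gaussian norm dependence absorbed into an absolute constant $C$: for $N$ i.i.d. centered sub-Gaussian samples in $\mathbb{R}^D$ with covariance $\Sigma$, with probability at least $1 - 2e^{-t}$ the sample covariance $\widehat{\Sigma}_N$ satisfies

```latex
\bigl\| \widehat{\Sigma}_N - \Sigma \bigr\|
  \;\le\; C \left( \sqrt{\frac{D + t}{N}} + \frac{D + t}{N} \right) \|\Sigma\|.
```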