Limiting laws of coherence of random matrices with applications to testing covariance structure and construction of compressed sensing matrices
T. Tony Cai, Tiefeng Jiang
TLDR
In this paper, the limiting laws of the coherence of an n × p random matrix in the high-dimensional setting where p can be much larger than n are derived and applied to the construction of compressed sensing matrices.

Abstract
Testing covariance structure is of significant interest in many areas of statistical analysis, and the construction of compressed sensing matrices is an important problem in signal processing. Motivated by these applications, we study in this paper the limiting laws of the coherence of an n × p random matrix in the high-dimensional setting where p can be much larger than n. Both the law of large numbers and the limiting distribution are derived. We then consider testing the bandedness of the covariance matrix of a high-dimensional Gaussian distribution, which includes testing for independence as a special case. The limiting laws of the coherence of the data matrix play a critical role in the construction of the test. We also apply the asymptotic results to the construction of compressed sensing matrices.
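The coherence studied in the paper is the largest absolute sample correlation between distinct columns of the data matrix. A minimal numerical sketch of this statistic is below; the centering constants in the test statistic `T` follow the paper's limiting-law result as summarized here, and the choice of n, p, and seed is illustrative.

```python
import numpy as np

def coherence(X):
    """Coherence L_n: the largest absolute sample correlation
    between distinct columns of the n x p matrix X."""
    R = np.corrcoef(X, rowvar=False)  # p x p sample correlation matrix
    np.fill_diagonal(R, 0.0)          # ignore the trivial diagonal entries
    return np.abs(R).max()

# Simulate an n x p matrix with iid standard normal entries,
# with p larger than n (the high-dimensional regime of the paper).
rng = np.random.default_rng(0)
n, p = 200, 500
X = rng.standard_normal((n, p))

L = coherence(X)
# Centered and scaled statistic whose limiting distribution the
# paper derives (form assumed from the summary above).
T = n * L**2 - 4 * np.log(p) + np.log(np.log(p))
print(L, T)
```

For iid Gaussian entries the coherence concentrates near sqrt(4 log p / n), so for moderate n and much larger p it stays well below 1; this is what makes random matrices attractive for compressed sensing, where small coherence is desirable.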
Citations
Posted Content
A Very Simple, Positive Semi-Definite, Heteroskedasticity and Autocorrelation Consistent Covariance Matrix
Book
Sampling Theory: Beyond Bandlimited Systems
TL;DR: This book provides a comprehensive guide to the theory and practice of sampling from an engineering perspective and is also an invaluable reference or self-study guide for engineers and students across industry and academia.
Journal ArticleDOI
Bayesian linear regression with sparse priors
TL;DR: Under compatibility conditions on the design matrix, the posterior distribution is shown to contract at the optimal rate for recovery of the unknown sparse vector, and to give optimal prediction of the response vector.
Journal ArticleDOI
Two-Sample Covariance Matrix Testing and Support Recovery in High-Dimensional and Sparse Settings
T. Tony Cai, Weidong Liu, Yin Xia
TL;DR: A new test for testing the hypothesis H 0 is proposed and investigated to enjoy certain optimality and to be especially powerful against sparse alternatives and applications to gene selection are discussed.
References
Book
Compressed sensing
TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients; a good approximation to those N important coefficients is extracted from the n measurements by solving a linear program (Basis Pursuit in signal processing).
ReportDOI
A simple, positive semi-definite, heteroskedasticity and autocorrelation consistent covariance matrix
Whitney K. Newey, Kenneth D. West
TL;DR: In this article, a simple method of calculating a heteroskedasticity and autocorrelation consistent covariance matrix that is positive semi-definite by construction is described.
Journal ArticleDOI
Decoding by linear programming
Emmanuel J. Candès, Terence Tao
TL;DR: f can be recovered exactly by solving a simple convex optimization problem (which one can recast as a linear program), and numerical experiments suggest that this recovery procedure works unreasonably well; f is recovered exactly even in situations where a significant fraction of the output is corrupted.
Book
Large Deviations Techniques and Applications
Amir Dembo, Ofer Zeitouni
TL;DR: Presents the LDP for abstract empirical measures, the finite-dimensional case, and applications of empirical measure LDPs.
Journal ArticleDOI
Heteroskedasticity and Autocorrelation Consistent Covariance Matrix Estimation
TL;DR: Using these results, data-dependent automatic bandwidth/lag truncation parameters are introduced, and an asymptotically optimal kernel/weighting scheme and bandwidth/lag truncation parameters are obtained.