scispace - formally typeset
Open Access · Journal Article

Restricted strong convexity and weighted matrix completion: optimal bounds with noise

TLDR
In this article, the authors considered the matrix completion problem under a form of row/column weighted entrywise sampling, including the case of uniform entrywise sampling as a special case.
Abstract
We consider the matrix completion problem under a form of row/column weighted entrywise sampling, including the case of uniform entrywise sampling as a special case. We analyze the associated random observation operator, and prove that with high probability, it satisfies a form of restricted strong convexity with respect to the weighted Frobenius norm. Using this property, we obtain as corollaries a number of error bounds on matrix completion in the weighted Frobenius norm under noisy sampling and for both exact and near low-rank matrices. Our results are based on measures of the 'spikiness' and 'low-rankness' of matrices that are less restrictive than the incoherence conditions imposed in previous work. Our technique involves an M-estimator that includes controls on both the rank and spikiness of the solution, and we establish non-asymptotic error bounds in the weighted Frobenius norm for recovering matrices lying in lq-"balls" of bounded spikiness. Using information-theoretic methods, we show that no algorithm can achieve better estimates (up to a logarithmic factor) over these same sets, showing that our conditions on matrices and associated rates are essentially optimal.
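The estimator described above combines a low-rank-inducing penalty with a bound on entrywise "spikiness". A minimal NumPy sketch of this style of estimator is given below, using proximal gradient descent with singular value soft-thresholding (the nuclear-norm proximal step) on the observed entries; the function name `complete_matrix` and the entrywise clipping used as a stand-in for the paper's spikiness control are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def complete_matrix(Y, mask, lam=1.0, alpha=None, n_iters=200, step=1.0):
    """Nuclear-norm penalized matrix completion via proximal gradient.

    Approximately minimizes
        0.5 * ||mask * (M - Y)||_F^2 + lam * ||M||_*,
    optionally clipping entries to [-alpha, alpha] as a crude
    spikiness control (a heuristic proxy for a sup-norm constraint).
    """
    M = np.zeros_like(Y, dtype=float)
    for _ in range(n_iters):
        # Gradient step on the observed-entry least-squares loss.
        Z = M - step * (mask * (M - Y))
        # Proximal step: soft-threshold the singular values.
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s = np.maximum(s - step * lam, 0.0)
        M = (U * s) @ Vt
        # Optional cap on the maximum entry magnitude.
        if alpha is not None:
            M = np.clip(M, -alpha, alpha)
    return M

# Tiny demo: recover a rank-1 matrix from ~70% of its entries.
rng = np.random.default_rng(0)
u, v = rng.normal(size=(20, 1)), rng.normal(size=(1, 20))
A = u @ v
mask = (rng.random(A.shape) < 0.7).astype(float)
M_hat = complete_matrix(mask * A, mask, lam=0.1)
err = np.linalg.norm(M_hat - A) / np.linalg.norm(A)
```

With a fixed step size of 1 the iteration is stable, since the gradient of the observed-entry loss is 1-Lipschitz; the clipping step makes the update a heuristic rather than an exact proximal method for the jointly constrained problem.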



Citations
Proceedings Article

A unified framework for high-dimensional analysis of M-estimators with decomposable regularizers

TL;DR: A unified framework for establishing consistency and convergence rates for regularized M-estimators under high-dimensional scaling is provided; one main theorem is stated, and it is shown how it can be used to re-derive several existing results and also to obtain several new results.
Journal Article (DOI)

A Unified Framework for High-Dimensional Analysis of $M$-Estimators with Decomposable Regularizers

TL;DR: In this paper, a unified framework for establishing consistency and convergence rates for regularized M$-estimators under high-dimensional scaling was provided, which can be used to re-derive some existing results.
Book

High-Dimensional Statistics: A Non-Asymptotic Viewpoint

TL;DR: This book provides a self-contained introduction to the area of high-dimensional statistics, aimed at the first-year graduate level, and includes chapters that are focused on core methodology and theory - including tail bounds, concentration inequalities, uniform laws and empirical process, and random matrices.
Journal Article (DOI)

Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion

TL;DR: In this article, a new nuclear-norm penalized estimator of A0 was proposed and established a general sharp oracle inequality for this estimator for arbitrary values of n, m1, m2 under the condition of isometry in expectation.
Journal Article (DOI)

Noisy matrix decomposition via convex relaxation: Optimal rates in high dimensions

TL;DR: In this paper, the authors analyze a class of estimators based on convex relaxation for solving high-dimensional matrix decomposition problems.
References
Book

Matrix Analysis

TL;DR: In this article, the authors present both classical and recent results in matrix analysis, using canonical forms as a unifying theme, and demonstrate their importance in a variety of applications of linear algebra and matrix theory.
Book

Nonlinear Programming

Journal Article (DOI)

Exact Matrix Completion via Convex Optimization

TL;DR: It is proved that one can perfectly recover most low-rank matrices from what appears to be an incomplete set of entries, and that objects other than signals and images can be perfectly reconstructed from very limited information.
Journal Article (DOI)

Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization

TL;DR: It is shown that if a certain restricted isometry property holds for the linear transformation defining the constraints, the minimum-rank solution can be recovered by solving a convex optimization problem, namely, the minimization of the nuclear norm over the given affine space.
Book Chapter (DOI)

Introduction to the non-asymptotic analysis of random matrices.

TL;DR: This is a tutorial on some basic non-asymptotic methods and concepts in random matrix theory, particularly for the problem of estimating covariance matrices in statistics and for validating probabilistic constructions of measurement matrices in compressed sensing.