Open Access Posted Content
Convergence bounds for nonlinear least squares and applications to tensor recovery.
TLDR
In this paper, the problem of approximating a function in general nonlinear subsets of $L^2$ when only a weighted Monte Carlo estimate of the norm can be computed is considered.
Abstract:
We consider the problem of approximating a function in general nonlinear subsets of $L^2$ when only a weighted Monte Carlo estimate of the $L^2$-norm can be computed. Of particular interest in this setting is the concept of sample complexity, the number of samples that are necessary to recover the best approximation. Bounds for this quantity have been derived in a previous work and depend primarily on the model class and are not influenced positively by the regularity of the sought function. This result however is only a worst-case bound and is not able to explain the remarkable performance of iterative hard thresholding algorithms that is observed in practice. We reexamine the results of the previous paper and derive a new bound that is able to utilize the regularity of the sought function. A critical analysis of our results allows us to derive a sample efficient algorithm for the model set of low-rank tensors. The viability of this algorithm is demonstrated by recovering quantities of interest for a classical high-dimensional random partial differential equation.
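The empirical projection the abstract describes can be sketched in a few lines. This is a minimal illustration, not the paper's algorithm: it uses a *linear* polynomial subspace as the model class (the paper treats general nonlinear sets such as low-rank tensors), uniform sampling with trivial weights, and a target function chosen here purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Illustrative smooth function to approximate in L^2([0, 1]).
    return np.sin(2 * np.pi * x)

d = 6    # dimension of the polynomial model class {1, x, ..., x^5}
n = 200  # number of Monte Carlo samples

x = rng.uniform(0.0, 1.0, size=n)  # samples from the reference measure
w = np.ones(n)                     # trivial weights for the uniform density
                                   # (the paper's setting allows general weights)

# Design matrix of the basis evaluated at the samples.
A = np.vander(x, d, increasing=True)

# Weighted empirical least squares: minimize sum_i w_i (f(x_i) - (A c)_i)^2.
sw = np.sqrt(w)
coef, *_ = np.linalg.lstsq(sw[:, None] * A, sw * target(x), rcond=None)

# Empirical L^2 error of the recovered approximation on a fresh grid.
xv = np.linspace(0.0, 1.0, 1000)
err = np.sqrt(np.mean((np.vander(xv, d, increasing=True) @ coef - target(xv)) ** 2))
```

With 200 samples and a 6-dimensional model class, the empirical projection is close to the best approximation; the sample complexity question is how small `n` can be made before this breaks down.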
Citations
Posted Content
Adaptive non-intrusive reconstruction of solutions to high-dimensional parametric PDEs
TL;DR: In this paper, a non-intrusive generalization of the adaptive Galerkin FEM is proposed, steered by a reliable residual-based error estimator.
References
Journal Article
Matplotlib: A 2D Graphics Environment
TL;DR: Matplotlib is a 2D graphics package for Python, used for application development, interactive scripting, and publication-quality image generation across user interfaces and operating systems.
Journal Article
Array programming with NumPy
Charles R. Harris, K. Jarrod Millman, Stefan van der Walt, Ralf Gommers, Pauli Virtanen, David Cournapeau, Eric Wieser, Julian Taylor, Sebastian Berg, Nathaniel J. Smith, Robert Kern, Matti Picus, Stephan Hoyer, Marten H. van Kerkwijk, Matthew Brett, Allan Haldane, Jaime Fernández del Río, Mark Wiebe, Pearu Peterson, Pierre Gérard-Marchant, Kevin Sheppard, Tyler Reddy, Warren Weckesser, Hameer Abbasi, Christoph Gohlke, Travis E. Oliphant +28 more
TL;DR: In this paper, the authors review how a few fundamental array concepts lead to a simple and powerful programming paradigm for organizing, exploring and analysing scientific data, and their evolution into a flexible interoperability layer between increasingly specialized computational libraries is discussed.
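The "fundamental array concepts" this TL;DR refers to include broadcasting, which can be shown in a minimal sketch (the variable names and values here are ours, for illustration only):

```python
import numpy as np

# Broadcasting: arrays with compatible shapes combine elementwise
# without explicit loops.
row = np.arange(3)           # shape (3,)
col = np.arange(4)[:, None]  # shape (4, 1)
table = col * 10 + row       # shapes broadcast to (4, 3)
```

Here `table[i, j]` equals `10 * i + j`, computed without writing a single loop.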
Journal Article
Stable signal recovery from incomplete and inaccurate measurements
TL;DR: In this paper, the authors considered the problem of recovering a vector x ∈ R^m from incomplete and contaminated observations y = Ax + e, where e is an error term.
Journal Article
Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
TL;DR: It is shown that if a certain restricted isometry property holds for the linear transformation defining the constraints, the minimum-rank solution can be recovered by solving a convex optimization problem, namely, the minimization of the nuclear norm over the given affine space.
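As a brief illustration of the objective in this TL;DR (not of the convex solver itself), the nuclear norm is the sum of a matrix's singular values, and for a low-rank matrix the SVD exposes the rank directly; this sketch uses a small synthetic example of our own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 20, 15, 3
# A random rank-3 matrix as ground truth.
M = rng.normal(size=(m, r)) @ rng.normal(size=(r, n))

sv = np.linalg.svd(M, compute_uv=False)  # singular values, descending
nuclear_norm = sv.sum()                  # ||M||_* = sum of singular values
numerical_rank = int(np.sum(sv > 1e-10 * sv[0]))
```

Minimizing `nuclear_norm` over an affine constraint set is a convex surrogate for minimizing `numerical_rank`, which is what makes the recovery problem tractable.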
Journal Article
The Power of Convex Relaxation: Near-Optimal Matrix Completion
Emmanuel J. Candès, Terence Tao +1 more
TL;DR: This paper shows that, under certain incoherence assumptions on the singular vectors of the matrix, recovery is possible by solving a convenient convex program as soon as the number of entries is on the order of the information theoretic limit (up to logarithmic factors).
Related Papers (5)
Convex Regularization for High-Dimensional Multi-Response Tensor Regression
Adaptive Near-Optimal Rank Tensor Approximation for High-Dimensional Operator Equations
Markus Bachmayr, Wolfgang Dahmen +1 more