Open Access · Journal Article · DOI

Kolmogorov widths and low-rank approximations of parametric elliptic PDEs

Markus Bachmayr, +1 more
- 20 Jul 2016
- Vol. 86, Iss. 304, pp. 701–724
TLDR
This paper shows that the decay of the n-widths can be controlled by that of the error achieved by best n-term polynomial approximations in the parametric variable, and that for piecewise constant diffusion coefficients the n-widths decay significantly faster.
Abstract
Kolmogorov n-widths and low-rank approximations are studied for families of elliptic diffusion PDEs parametrized by the diffusion coefficients. The decay of the n-widths can be controlled by that of the error achieved by best n-term approximations using polynomials in the parametric variable. However, we prove that in certain relevant instances where the diffusion coefficients are piecewise constant over a partition of the physical domain, the n-widths exhibit significantly faster decay. This, in turn, yields a theoretical justification of the fast convergence of reduced basis or POD methods when treating such parametric PDEs. Our results are confirmed by numerical experiments, which also reveal the influence of the partition geometry on the decay of the n-widths.
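The behavior described in the abstract can be observed numerically in a toy setting. The sketch below (an illustrative assumption, not the paper's actual experiments) solves the 1D model problem -(a(x) u'(x))' = 1 on (0,1) with u(0) = u(1) = 0, where the diffusion coefficient a is piecewise constant on the two halves of the domain, and inspects the singular-value decay of a snapshot matrix over the parameter square. The singular values bound the Kolmogorov n-widths of the discrete snapshot set; the discretization and parameter ranges are arbitrary choices for illustration.

```python
import numpy as np

def solve_diffusion(a_vals, n=200):
    """Finite-difference solution of -(a u')' = 1 with homogeneous Dirichlet BCs.

    a_vals = (a1, a2): coefficient values on [0, 1/2) and [1/2, 1]."""
    h = 1.0 / n
    x_mid = (np.arange(n) + 0.5) * h                 # cell midpoints
    a = np.where(x_mid < 0.5, a_vals[0], a_vals[1])  # piecewise constant coefficient
    # tridiagonal stiffness matrix on the interior nodes x_1, ..., x_{n-1}
    main = (a[:-1] + a[1:]) / h**2
    off = -a[1:-1] / h**2
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.solve(A, np.ones(n - 1))

# snapshots over a 15x15 grid of parameter pairs (a1, a2) in [1, 10]^2
params = [(a1, a2) for a1 in np.linspace(1, 10, 15)
                   for a2 in np.linspace(1, 10, 15)]
S = np.column_stack([solve_diffusion(p) for p in params])
sv = np.linalg.svd(S, compute_uv=False)
print(sv[:8] / sv[0])  # fast decay: very few modes capture the snapshot set
```

In this particularly simple 1D two-piece setting the solution manifold is exactly low-dimensional, so the decay is extreme; the paper's point is the more general one that piecewise constant coefficients yield much faster n-width decay than the polynomial best n-term rates would suggest.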


Citations
Journal ArticleDOI

Model reduction of dynamical systems on nonlinear manifolds using deep convolutional autoencoders

TL;DR: The method is shown to significantly outperform even the optimal linear-subspace ROM on benchmark advection-dominated problems, demonstrating its ability to overcome the intrinsic $n$-width limitations of linear subspaces.
Journal ArticleDOI

Tensor Networks and Hierarchical Tensors for the Solution of High-Dimensional Partial Differential Equations

TL;DR: A survey of techniques for computing hierarchical low-rank approximations, including local optimisation techniques on Riemannian manifolds as well as truncated iteration methods, which can be applied to solving high-dimensional partial differential equations.
Journal ArticleDOI

A Theoretical Analysis of Deep Neural Networks and Parametric PDEs

TL;DR: The existence of a small reduced basis is used to construct neural networks that yield approximations of the parametric solution maps in such a way that the sizes of these networks essentially only depend on the size of the reduced basis.
Posted Content

A Theoretical Analysis of Deep Neural Networks and Parametric PDEs

TL;DR: In this article, the authors derive upper bounds on the complexity of ReLU neural networks approximating the solution maps of parametric partial differential equations without any knowledge of the solution manifold's concrete shape, using its inherent low-dimensionality to obtain approximation rates significantly superior to those provided by classical methods.
Journal ArticleDOI

Practical error bounds for a non-intrusive bi-fidelity approach to parametric/stochastic model reduction

TL;DR: This work derives a novel, pragmatic estimate for the error committed by this bi-fidelity model that relies on the low-rank structure, if it exists, of the map between model parameters/uncertain inputs and the solution of interest, and shows that this error bound can be used to determine whether a given pair of low- and high-fidelity models will lead to an accurate bi-fidelity approximation.
References
Journal ArticleDOI

Reduced basis approximation and a posteriori error estimation for affinely parametrized elliptic coercive partial differential equations

TL;DR: (hierarchical, Lagrange) reduced basis approximation and a posteriori error estimation for linear functional outputs of affinely parametrized elliptic coercive partial differential equations are considered.
Journal ArticleDOI

Convergence Rates for Greedy Algorithms in Reduced Basis Methods

TL;DR: The reduced basis method was introduced for the accurate online evaluation of solutions to a parameter-dependent family of elliptic PDEs by determining a "good" n-dimensional space to be used in approximating the elements of a compact set $\mathcal{F}$ in a Hilbert space $\mathcal{H}$.
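The greedy construction analyzed in this reference can be sketched as follows. This is a generic strong-greedy sketch over a finite snapshot set, not the cited authors' code: at each step it adds the snapshot worst-approximated by the current space, orthonormalizing as it goes; all names are illustrative assumptions.

```python
import numpy as np

def greedy_basis(S, n_max, tol=1e-10):
    """Greedily select an orthonormal basis Q from the columns of S.

    Returns (Q, errs), where errs[k] is the worst projection error at step k."""
    Q = np.zeros((S.shape[0], 0))
    errs = []
    for _ in range(n_max):
        # residual of every snapshot after projection onto span(Q)
        R = S - Q @ (Q.T @ S)
        norms = np.linalg.norm(R, axis=0)
        j = np.argmax(norms)          # worst-approximated snapshot
        errs.append(norms[j])
        if norms[j] < tol:
            break
        Q = np.column_stack([Q, R[:, j] / norms[j]])
    return Q, np.array(errs)

# usage on synthetic low-rank data: the error drops to (near) zero
# once the greedy space captures the snapshots' column space
rng = np.random.default_rng(0)
S = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 100))
Q, errs = greedy_basis(S, n_max=10)
```

The cited convergence results compare the decay of such greedy errors with the Kolmogorov n-widths of the compact set being approximated.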
Journal ArticleDOI

Analytic regularity and polynomial approximation of parametric and stochastic elliptic pde's

TL;DR: In this article, the authors considered a model class of second order, linear, parametric, elliptic PDE's in a bounded domain D with coefficients depending on possibly countably many parameters and showed that the dependence of the solution on the parameters in the diffusion coefficient is analytically smooth.
Journal ArticleDOI

Convergence Rates of Best N -term Galerkin Approximations for a Class of Elliptic sPDEs

TL;DR: New regularity theorems describing the smoothness properties of the solution u as a map from y ∈ U = (−1,1)^∞ to a smoothness space W ⊂ V are established, leading to analytic estimates on the W norms of the gpc coefficients and on their space discretization error.
Journal ArticleDOI

Sparse tensor discretizations of high-dimensional parametric and stochastic PDEs

TL;DR: Partial differential equations with random input data, such as random loadings and coefficients, are reformulated as parametric, deterministic PDEs on parameter spaces of high, possibly infinite dimension to derive representations of the random solutions' laws on infinite-dimensional parameter spaces in terms of 'generalized polynomial chaos' (GPC) series.