Book Chapter
Fitting Multidimensional Data Using Gradient Penalties and Combination Techniques
Jochen Garcke, Markus Hegland +1 more
pp. 235–248
TL;DR: This investigation shows how overfitting arises when the mesh size goes to zero, and that modified "optimal" combination coefficients provide an advantage over the ones used originally for the numerical solution of PDEs, which in this case simply amplify the sampling noise.
Abstract: Sparse grids, combined with gradient penalties, provide an attractive tool for regularised least squares fitting. It has earlier been found that the combination technique, which allows the approximation of the sparse grid fit with a linear combination of fits on partial grids, is here not as effective as it is in the case of elliptic partial differential equations. We argue that this is due to the irregular and random data distribution, as well as the proportion of the number of data points to the grid resolution. These effects are investigated both in theory and experiments. The application of modified "optimal" combination coefficients provides an advantage over the ones used originally for the numerical solution of PDEs, which in this case simply amplify the sampling noise. As part of this investigation we also show how overfitting arises when the mesh size goes to zero.
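The combination technique described in the abstract approximates a sparse grid solution by summing solutions computed on coarse anisotropic grids with coefficients +1 and -1. A minimal sketch for plain bilinear interpolation in two dimensions (the test function, evaluation point, and level are illustrative and not taken from the paper; the paper itself applies the technique to regularised regression, not interpolation):

```python
import numpy as np

def grid_interpolant(f, li, lj):
    """Bilinear interpolant of f on an anisotropic grid of level (li, lj),
    i.e. 2**li + 1 points in x and 2**lj + 1 points in y on [0, 1]^2."""
    nx, ny = 2**li + 1, 2**lj + 1
    xs = np.linspace(0.0, 1.0, nx)
    ys = np.linspace(0.0, 1.0, ny)
    vals = f(xs[:, None], ys[None, :])  # grid values via broadcasting

    def u(x, y):
        # Locate the cell containing (x, y) and interpolate bilinearly.
        i = int(np.clip(np.searchsorted(xs, x) - 1, 0, nx - 2))
        j = int(np.clip(np.searchsorted(ys, y) - 1, 0, ny - 2))
        tx = (x - xs[i]) * (nx - 1)  # local coordinate in [0, 1]
        ty = (y - ys[j]) * (ny - 1)
        return ((1 - tx) * (1 - ty) * vals[i, j]
                + tx * (1 - ty) * vals[i + 1, j]
                + (1 - tx) * ty * vals[i, j + 1]
                + tx * ty * vals[i + 1, j + 1])
    return u

def combination(f, n, x, y):
    """Classical 2-D combination technique:
    u_n^c = sum_{i+j=n} u_{i,j}  -  sum_{i+j=n-1} u_{i,j}."""
    total = 0.0
    for i in range(n + 1):          # coefficient +1 on the diagonal |l| = n
        total += grid_interpolant(f, i, n - i)(x, y)
    for i in range(n):              # coefficient -1 on the diagonal |l| = n-1
        total -= grid_interpolant(f, i, n - 1 - i)(x, y)
    return total

f = lambda x, y: np.sin(np.pi * x) * np.sin(np.pi * y)
err = abs(combination(f, 6, 0.3, 0.7) - f(0.3, 0.7))
# For this smooth test function the error is small (roughly h^2 log(1/h)
# with h = 2**-6), far below the individual coarse-grid errors.
```

For interpolation with nested piecewise-linear spaces, this combination reproduces the sparse grid interpolant exactly; the paper's point is that the same +1/-1 coefficients are no longer well suited in the regression setting with noisy, irregularly distributed data.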
Citations
Journal Article
Principal manifold learning by sparse grids
TL;DR: This paper considers principal manifolds as the minimum of a regularized, non-linear empirical quantization error functional, and uses a sparse grid method in latent parameter space for the discretization.
Book Chapter
Recent Developments in the Theory and Application of the Sparse Grid Combination Technique
TL;DR: Substantial modifications of the choice of grids, the combination coefficients, the parallel data structures, and the algorithms used for the combination technique lead to numerical methods which are scalable and which are shown to have good performance.
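The modified "optimal" combination coefficients mentioned above and in the abstract can, in the regression setting, be obtained by minimising the residual of the combined fit over the coefficients rather than fixing them at +1/-1. A minimal sketch in which simple polynomial fits stand in for the fits on partial grids (all data, degrees, and names here are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random(200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(200)  # noisy samples

# Hypothetical "partial fits": polynomial least-squares fits of increasing
# degree, standing in for regularised fits on different partial grids.
preds = np.stack(
    [np.polyval(np.polyfit(x, y, d), x) for d in (1, 3, 5)], axis=1
)

# "Optimal" coefficients minimise the residual of the combined fit:
#   min_c || preds @ c - y ||_2
# solved here via least squares instead of fixed +1/-1 coefficients.
c, *_ = np.linalg.lstsq(preds, y, rcond=None)
combined = preds @ c

# By construction, the optimised combination fits the data at least as well
# as any single partial fit (each single fit lies in the search space).
best_single = min(np.mean((preds[:, k] - y) ** 2) for k in range(3))
combined_mse = np.mean((combined - y) ** 2)
```

The design point is that the coefficients adapt to the data: where a fixed-coefficient combination would amplify sampling noise, the optimised coefficients can down-weight partial fits that contribute mostly noise.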
References
Book
Spline models for observational data
TL;DR: In this paper, a theory and practice for the estimation of functions from noisy data on functionals is developed, in which convergence properties, data-based smoothing parameter selection, confidence intervals, and numerical methods are established that are appropriate to a number of problems within this framework.
Journal Article
Data mining with sparse grids
TL;DR: It turns out that the new method achieves correctness rates which are competitive with those of the best existing methods, while scaling only linearly with the amount of data to be classified.
Book Chapter
Rate of Convergence of the Method of Alternating Projections
TL;DR: In this paper, a proof is given of a rate of convergence theorem for the method of alternating projections, which had been announced earlier in [8] without proof.
Related Papers (5)
Variance-based global sensitivity analysis via sparse-grid interpolation and cubature
Gregery T. Buzzard, Dongbin Xiu +1 more