scispace - formally typeset
Topic: Sparse grid

About: Sparse grid is a research topic. Over its lifetime, 1,013 publications have been published within this topic, receiving 20,664 citations.


Papers
01 Jan 2014
TL;DR: This work found that the average of the local error r at the data points xi, weighted with the evaluation of the candidate basis function φ(xi), is well suited as a refinement indicator, allowing large, high-dimensional datasets to be tackled more efficiently.
Abstract: The amount of available data increases rapidly. This trend, often referred to as Big Data, challenges modern data mining algorithms, requiring new methods that can cope with very large, multi-variate regression problems. A promising approach that can tackle non-linear, higher-dimensional problems is regression using sparse grids. Sparse grids use a multi-scale system of grids with basis functions φ with local support to circumvent the curse of dimensionality (BG04). Sparse grid based regression (Pfl10) iteratively constructs a function f̂(x) = Σ_i α_i φ_i(x) that approximates the solution f of the regression problem, finding optimal weights α_i for a fixed grid using least squares optimization. For further error reduction, refinement algorithms then add new basis functions and the optimization is repeated. Current refinement algorithms add hierarchical child functions in all d input dimensions if a neighborhood with a high local error is discovered. This results in the addition of less beneficial basis functions if not all input dimensions are equally important. Instead, we propose a heuristic method that identifies and introduces only the new basis functions with the potentially highest error reduction. We found that the average of the local error r at the data points xi, weighted with the evaluation of the basis function φ in question, (1/n) Σ_{i=1}^n φ(xi) r(xi), is well suited as a refinement indicator, as we can then use the heuristics from (Pfl10) to estimate the potential error reduction. The number of data points n required for a significant approximation is in O(d). The algorithm has been tested with both synthetic and real-world datasets from the UCI repository. We have found that our algorithm offers a 30 to 50 percent improvement in error decay over current methods while reducing the grid size by 40 to 75 percent, allowing us to tackle large, high-dimensional datasets more efficiently. The accuracy remains comparable with current approaches.
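The refinement indicator described above, (1/n) Σ_{i=1}^n φ(xi) r(xi), can be sketched in a few lines. This is a minimal illustration, not the authors' implementation; the function and variable names are assumptions.

```python
import numpy as np

# Minimal sketch of the proposed refinement indicator: the average of the
# local errors r(x_i) weighted by the candidate basis function phi(x_i).
# Names are illustrative, not from the paper's code.
def refinement_indicator(phi_at_data, residuals):
    """Compute (1/n) * sum_i phi(x_i) * r(x_i) over the n data points."""
    phi_at_data = np.asarray(phi_at_data, dtype=float)
    residuals = np.asarray(residuals, dtype=float)
    return float(np.mean(phi_at_data * residuals))

# Candidate basis functions whose support overlaps regions of high local
# error get high scores and would be refined first.
scores = {
    "phi_a": refinement_indicator([0.9, 0.8, 0.0, 0.0], [0.5, 0.4, 0.1, 0.1]),
    "phi_b": refinement_indicator([0.0, 0.0, 0.9, 0.8], [0.5, 0.4, 0.1, 0.1]),
}
best = max(scores, key=scores.get)  # phi_a, since its support covers the large errors
```

Because the indicator only needs the residuals and one basis-function evaluation per data point, it is cheap to compute for every candidate child function before any weight re-optimization.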
Journal ArticleDOI
TL;DR: This work proposes a method to preserve high-luminance information in a single-shot image by introducing a grid of highlight-preserving pixels equal to 1% of the total number of pixels, and demonstrates that this number of pixels is enough to gain additional dynamic range.
Abstract: Camera sensors are physically restricted in the amount of luminance that can be captured at once. To achieve a higher dynamic range, multiple exposures are typically combined. This method comes with several disadvantages, like temporal or alignment aliasing. Hence, we propose a method to preserve high-luminance information in a single-shot image. By introducing a grid of highlight-preserving pixels, which equals 1% of the total number of pixels, we are able to retain information directly in-camera for later processing. To provide evidence that this number of pixels is enough to gain additional dynamic range, we use a U-Net for reconstruction. For training, we make use of the HDR+ dataset, which we augment to simulate our proposed grid. We demonstrate that our approach can preserve high-luminance information, which can be used for a visually convincing reconstruction close to the ground truth.
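When augmenting a dataset to simulate such a sensor, the 1% grid of highlight-preserving pixels can be modeled as a boolean mask. A minimal sketch, assuming a regular grid layout (the paper does not specify the exact pattern, and the function name is hypothetical):

```python
import numpy as np

def highlight_mask(height, width, fraction=0.01):
    """Boolean mask marking a regular grid of highlight-preserving pixels
    that covers roughly `fraction` of the sensor (1% by default)."""
    # Marking every k-th pixel along both axes covers ~1/k^2 of the image,
    # so k = round(1 / sqrt(fraction)); k = 10 gives the 1% grid.
    step = int(round(1.0 / np.sqrt(fraction)))
    mask = np.zeros((height, width), dtype=bool)
    mask[::step, ::step] = True
    return mask

mask = highlight_mask(100, 100)
# On a 100x100 image, exactly 100 of 10,000 pixels (1%) are marked.
```

During augmentation, the masked positions would keep (or simulate) unclipped highlight values while the remaining 99% of pixels stay as ordinary, possibly saturated, sensor readings.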
Journal ArticleDOI
TL;DR: An alternative to the direct polynomial chaos expansion is proposed in order to approximate a one-dimensional uncertain field exhibiting steep fronts; it is more accurate and less expensive than the direct approach, since the regularity of the level points with respect to the input parameters allows achieving convergence with low-order polynomial series.
Abstract: We propose an alternative approach to the direct polynomial chaos expansion in order to approximate a one-dimensional uncertain field exhibiting steep fronts. The principle of our non-intrusive approach is to decompose the level points of the quantity of interest in order to avoid the spurious oscillations encountered in the direct approach. This method is more accurate and less expensive than the direct approach, since the regularity of the level points with respect to the input parameters allows achieving convergence with low-order polynomial series. The additional computational cost induced in the post-processing phase is largely offset by the use of low-level sparse grids, which require a small number of direct model evaluations in comparison with high-level sparse grids. We apply the method to a subsurface flow problem with uncertain hydraulic conductivity. Infiltration test cases with different levels of complexity are presented.
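For context, the direct non-intrusive polynomial chaos expansion that this paper improves upon can be fitted by least squares. A hedged sketch, assuming a uniform input on [-1, 1] (hence Legendre polynomials) and illustrative names; it shows the direct baseline, not the authors' level-point decomposition:

```python
import numpy as np
from numpy.polynomial import legendre

# Direct non-intrusive PCE of a quantity of interest q(xi), xi ~ U(-1, 1):
# fit a Legendre series to sampled model evaluations by least squares.
def pce_fit(q, order, n_samples=200, seed=0):
    rng = np.random.default_rng(seed)
    xi = rng.uniform(-1.0, 1.0, n_samples)       # sampled input parameters
    V = legendre.legvander(xi, order)            # Legendre basis at the samples
    coeffs, *_ = np.linalg.lstsq(V, q(xi), rcond=None)
    return coeffs                                # Legendre series coefficients

coeffs = pce_fit(lambda x: x**2, order=4)
# x^2 = (1/3)*P0 + (2/3)*P2, so coeffs ≈ [1/3, 0, 2/3, 0, 0]
```

For a smooth quantity of interest this converges quickly, but a steep front in the field produces the spurious (Gibbs-like) oscillations mentioned above, which motivates expanding the regular level points instead.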
Journal ArticleDOI
TL;DR: In this article, a Monte Carlo strategy is used to explore the effect that stochasticity in the parameters has on important features of the plasma boundary such as the location of the x-point, the strike points, and shaping attributes such as triangularity and elongation.

Network Information
Related Topics (5)
- Discretization: 53K papers, 1M citations, 89% related
- Iterative method: 48.8K papers, 1.2M citations, 83% related
- Numerical analysis: 52.2K papers, 1.2M citations, 83% related
- Partial differential equation: 70.8K papers, 1.6M citations, 82% related
- Differential equation: 88K papers, 2M citations, 78% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    14
2022    42
2021    57
2020    40
2019    60
2018    72