Topic

Sparse grid

About: Sparse grid is a research topic. Over its lifetime, 1013 publications have been published within this topic, receiving 20664 citations.


Papers
Book Chapter
09 Aug 2015
TL;DR: The paper presents a standard Polynomial Chaos Expansion, an Uncertainty Quantification-High Dimensional Model Representation, a Generalised Kriging model, and an expansion in Tchebycheff polynomials on sparse grids, and assesses the computational cost and suitability of these methods for propagating different types of orbits.
Abstract: The paper presents four different non-intrusive approaches to the propagation of uncertainty in orbital dynamics, with particular application to space debris orbit analysis. Intrusive approaches are generally understood as those methods that require a modification of the original problem, either by introducing a new algebra or by directly embedding high-order polynomial expansions of the uncertain quantities in the governing equations. Non-intrusive approaches are instead based on polynomial representations built from sparse samples of the system response to the uncertain quantities. The paper presents a standard Polynomial Chaos Expansion, an Uncertainty Quantification-High Dimensional Model Representation, a Generalised Kriging model, and an expansion in Tchebycheff polynomials on sparse grids. The work assesses the computational cost and the suitability of these methods for propagating different types of orbits.

11 citations
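The non-intrusive workflow described in the abstract above reduces to three steps: sample the expensive propagator at a small set of nodes in the uncertain parameters, fit a polynomial surrogate to the sampled responses, then push the uncertainty through the cheap surrogate. Below is a minimal one-dimensional sketch of that workflow; the Chebyshev surrogate stands in for the paper's multidimensional sparse-grid expansions, and the toy decay model, the propagate function, the uncertainty interval, and all numerical values are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical toy dynamics with one uncertain parameter p; a stand-in for an
# orbit propagator, NOT the model used in the paper.
def propagate(p, t_final=1.0, steps=200):
    x = 1.0
    dt = t_final / steps
    for _ in range(steps):
        x += dt * (-p * x)          # simple exponential decay in place of orbital dynamics
    return x

# Non-intrusive step 1: evaluate the model at Chebyshev nodes of the uncertain parameter.
deg = 8
nodes = np.cos(np.pi * (np.arange(deg + 1) + 0.5) / (deg + 1))   # Chebyshev-Gauss nodes on [-1, 1]
p_lo, p_hi = 0.5, 1.5                                            # assumed uncertainty interval
p_nodes = 0.5 * (p_hi + p_lo) + 0.5 * (p_hi - p_lo) * nodes
samples = np.array([propagate(p) for p in p_nodes])

# Non-intrusive step 2: fit a Chebyshev surrogate to the sampled responses.
coeffs = C.chebfit(nodes, samples, deg)

# Non-intrusive step 3: propagate the uncertainty through the cheap surrogate
# instead of re-running the model.
u = np.random.default_rng(0).uniform(-1.0, 1.0, 100_000)
surrogate_vals = C.chebval(u, coeffs)
print("mean of final state:", surrogate_vals.mean())
print("std  of final state:", surrogate_vals.std())
```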

Journal ArticleDOI
TL;DR: In this paper, a sparse grid collocation method combined with a time discretization of the differential equations is proposed for computing expectations of functionals of solutions to differential equations perturbed by time-dependent white noise.
Abstract: We consider a sparse grid collocation method in conjunction with a time discretization of the differential equations for computing expectations of functionals of solutions to differential equations perturbed by time-dependent white noise. We first analyze the error of Smolyak's sparse grid collocation used to evaluate expectations of functionals of solutions to stochastic differential equations discretized by the Euler scheme. We show theoretically and numerically that this algorithm can have satisfactory accuracy for small noise magnitude or small integration time; however, it does not converge either as the Euler scheme's time step decreases or as Smolyak's sparse grid level increases. Subsequently, we use this method as a building block for a new algorithm that combines sparse grid collocation with a recursive procedure. This approach allows us to numerically integrate linear stochastic partial differential equations over longer times, which is illustrated in numerical tests on a stochastic advection-diffusion equation.

11 citations
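The building block analyzed in the abstract above can be illustrated with a small example: discretize the SDE with the Euler scheme, treat each normalized Brownian increment as a quadrature variable, and replace the expectation by Gauss-Hermite quadrature. The sketch below uses a full tensor product of one-dimensional rules over only three Euler steps purely for clarity; the paper's method substitutes Smolyak's sparse grid for the tensor product. The toy geometric Brownian motion, its coefficients, and the quadrature level are assumptions for illustration.

```python
import itertools
import numpy as np

# Toy linear SDE dX = a*X dt + b*X dW (geometric Brownian motion); an assumed example.
a, b, x0, T, n_steps = 0.1, 0.3, 1.0, 0.5, 3

# 1-D Gauss-Hermite rule for integrals against the standard normal density.
level = 5
gh_nodes, gh_weights = np.polynomial.hermite_e.hermegauss(level)  # probabilists' Hermite
gh_weights = gh_weights / np.sqrt(2.0 * np.pi)                    # normalize to sum to 1

def euler_path(zs):
    """Euler-Maruyama path driven by fixed normalized increments zs (one per step)."""
    dt = T / n_steps
    x = x0
    for z in zs:
        x = x + a * x * dt + b * x * np.sqrt(dt) * z
    return x

# Collocation: one quadrature dimension per Euler step. A full tensor grid is used here;
# the paper replaces it with Smolyak's sparse grid once n_steps grows.
mean = 0.0
for combo in itertools.product(range(level), repeat=n_steps):
    w = np.prod([gh_weights[i] for i in combo])
    x_T = euler_path([gh_nodes[i] for i in combo])
    mean += w * x_T

print("collocation estimate of E[X_T]:", mean)
print("continuous-time mean of GBM   :", x0 * np.exp(a * T))
```

The residual gap between the two printed values is the weak error of the Euler scheme; the quadrature itself is exact in this toy case because the Euler map is linear in each increment.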

Proceedings ArticleDOI
10 Jun 2009
TL;DR: Sparse grid-based identification is extended with a novel parameter robustness analysis that, unlike commonly used stochastic methods and most deterministic algorithms, exploits stored information about the entire global parameter space and can be applied to any type of quantitative model.
Abstract: A major limiting step in the creation of systems biology models is the determination of appropriate parameter values that fit available experimental data. Parameter identification is hindered by the experimental difficulties in examining biological systems and the growing size and complexity of nonlinear models. In addition, the majority of systems biology models are ‘sloppy,’ allowing many parameter sets to fit the data. Typically, these sets are distinguished only by their quantitative fit, with the goal of minimizing the least-squares error between simulation and data. Instead of this single-minded focus on error, parameter sets can also be distinguished by the model's relative robustness to parameter changes with that set. Robustness of a model in general has been explored, but choosing model parameters based on relative robustness is fairly new. This choice is reasonable both from the biological perspective, in that a system with robust parameters would be more resistant to mutations, and from the modeling perspective, in that robust parameters could allow easier re-fitting of the model to new data. A sparse grid-based parameter identification method has recently been developed for nonlinear models with large uncertain parameter spaces. Sparse grid parameter identification has the added benefit of storing information about the entire global parameter space, unlike commonly used stochastic methods and most deterministic algorithms. This information can be exploited for a robustness analysis that requires no additional model simulations or manipulation of the model equations. Herein, sparse grid-based identification is extended to include a novel parameter robustness analysis method that can be applied to any type of quantitative model.

11 citations
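As a rough illustration of the idea of reusing stored grid evaluations for robustness, the sketch below scores every candidate parameter set on a grid by its least-squares misfit and then, using only the already-stored neighbouring misfits, by how much the error changes under small parameter perturbations. The two-parameter exponential model, the synthetic data, and the regular grid standing in for the paper's sparse grid are all assumptions made for the example, not the authors' method.

```python
import numpy as np

# Hypothetical two-parameter model and synthetic "experimental" data; not from the paper.
t = np.linspace(0.0, 4.0, 20)
def model(k1, k2):
    return k1 * np.exp(-k2 * t)

data = model(2.0, 0.7) + 0.05 * np.random.default_rng(1).normal(size=t.size)

# Stand-in for the sparse grid: a regular grid of candidate parameter sets whose misfits
# are all stored, which is what enables the follow-on robustness analysis.
k1_grid = np.linspace(1.0, 3.0, 21)
k2_grid = np.linspace(0.2, 1.2, 21)
sse = np.array([[np.sum((model(k1, k2) - data) ** 2) for k2 in k2_grid] for k1 in k1_grid])

# Robustness score from stored grid values only: how much the misfit changes when a
# parameter set is perturbed to its grid neighbours (lower = more robust). No new simulations.
robustness = np.full_like(sse, np.inf)
for i in range(1, sse.shape[0] - 1):
    for j in range(1, sse.shape[1] - 1):
        neighbours = [sse[i - 1, j], sse[i + 1, j], sse[i, j - 1], sse[i, j + 1]]
        robustness[i, j] = max(abs(n - sse[i, j]) for n in neighbours)

# Among the best-fitting 5% of grid points, prefer the most robust one.
cutoff = np.quantile(sse, 0.05)
candidates = np.argwhere(sse <= cutoff)
best = min(candidates, key=lambda ij: robustness[tuple(ij)])
print("chosen parameters:", k1_grid[best[0]], k2_grid[best[1]])
```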

BookDOI
01 Jan 2014
TL;DR: This edited volume collects contributions on the numerical treatment of high-dimensional and stochastic problems, including chapters on adaptive sparse grids in reinforcement learning, adaptive low-rank approximation in the hierarchical tensor format, and nonlinear eigenproblems in data analysis.
Abstract:
D. Belomestny, C. Bender, F. Dickmann, and N. Schweizer: Solving Stochastic Dynamic Programs by Convex Optimization and Simulation
W. Dahmen, C. Huang, G. Kutyniok, W.-Q Lim, C. Schwab, and G. Welper: Efficient Resolution of Anisotropic Structures
R. Ressel, P. Dulk, S. Dahlke, K. S. Kazimierski, and P. Maass: Regularity of the Parameter-to-state Map of a Parabolic Partial Differential Equation
N. Chegini, S. Dahlke, U. Friedrich, and R. Stevenson: Piecewise Tensor Product Wavelet Bases by Extensions and Approximation Rates
P. A. Cioica, S. Dahlke, N. Dohring, S. Kinzel, F. Lindner, T. Raasch, K. Ritter, and R. Schilling: Adaptive Wavelet Methods for SPDEs
M. Altmayer, S. Dereich, S. Li, T. Muller-Gronbach, A. Neuenkirch, K. Ritter, and L. Yaroslavtseva: Constructive Quantization and Multilevel Algorithms for Quadrature of Stochastic Differential Equations
O. G. Ernst, B. Sprungk, and H.-J. Starkloff: Bayesian Inverse Problems and Kalman Filters
J. Diehl, P. Friz, H. Mai, H. Oberhauser, S. Riedel, and W. Stannat: Robustness in Stochastic Filtering and Maximum Likelihood Estimation for SDEs
J. Garcke and I. Klompmaker: Adaptive Sparse Grids in Reinforcement Learning
J. Ballani, L. Grasedyck, and M. Kluge: A Review on Adaptive Low-Rank Approximation Techniques in the Hierarchical Tensor Format
M. Griebel, J. Hamaekers, and F. Heber: A Bond Order Dissection ANOVA Approach for Efficient Electronic Structure Calculations
W. Hackbusch and R. Schneider: Tensor Spaces and Hierarchical Tensor Representations
L. Jost, S. Setzer, and M. Hein: Nonlinear Eigenproblems in Data Analysis - Balanced Graph Cuts and the Ratio DCA-Prox
M. Guillemard, D. Heinen, A. Iske, S. Krause-Solberg, and G. Plonka: Adaptive Approximation Algorithms for Sparse Data Representation
T. Jahnke and V. Sunkara: Error Bound for Hybrid Models of Two-scaled Stochastic Reaction Systems
R. Kiesel, A. Rupp, and K. Urban: Valuation of Structured Financial Products by Adaptive Multiwavelet Methods in High Dimensions
L. Kammerer, S. Kunis, I. Melzer, D. Potts, and T. Volkmer: Computational Methods for the Fourier Analysis of Sparse High-Dimensional Functions
E. Herrholz, D. Lorenz, G. Teschke, and D. Trede: Sparsity and Compressed Sensing in Inverse Problems
C. Lubich: Low-Rank Dynamics
E. Novak and D. Rudolf: Computation of Expectations by Markov Chain Monte Carlo Methods
H. Yserentant: Regularity, Complexity, and Approximability of Electronic Wave Functions
Index

11 citations

Proceedings Article
29 Apr 2013
TL;DR: It is shown that the cost of exact GPR can be reduced to a sub-quadratic function of N; this work extends these exact fast algorithms to sparse GPR and remarks on a connection to Gaussian process latent variable models (GPLVMs).
Abstract: Gaussian process regression (GPR) is a powerful non-linear technique for Bayesian inference and prediction. One drawback is its O(N^3) computational complexity for both prediction and hyperparameter estimation for N input points, which has led to much work on sparse GPR methods. When the covariance function is expressible as a tensor product kernel (TPK) and the inputs form a multidimensional grid, it was shown that the costs for exact GPR can be reduced to a sub-quadratic function of N. We extend these exact fast algorithms to sparse GPR and remark on a connection to Gaussian process latent variable models (GPLVMs). In practice, the inputs may also violate the multidimensional grid constraints, so we pose and efficiently solve missing and extra data problems for both exact and sparse grid GPR. We demonstrate our method on synthetic, text scan, and magnetic resonance imaging (MRI) data reconstructions.

11 citations
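The grid structure exploited above rests on a standard Kronecker identity: for inputs on a Cartesian grid and a tensor product kernel, the full covariance factors as K = kron(K1, K2), so the linear solve against K plus a noise term can be carried out through the small per-dimension factors rather than the full N x N matrix. Below is a minimal numpy sketch of that identity with an assumed squared-exponential kernel and synthetic responses; it demonstrates only the structured solve, not the paper's sparse extension or its missing/extra data handling.

```python
import numpy as np

def rbf(x, xp, ell):
    """Squared-exponential kernel matrix between two 1-D point sets."""
    d = x[:, None] - xp[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)

# Inputs on a Cartesian grid: N = n1 * n2 points, but only two small factor matrices are formed.
x1, x2 = np.linspace(0.0, 1.0, 15), np.linspace(0.0, 1.0, 12)
K1, K2 = rbf(x1, x1, 0.3), rbf(x2, x2, 0.2)      # tensor product kernel: K = kron(K1, K2)
Y = rng.normal(size=(x1.size, x2.size))           # toy responses laid out on the grid
sigma2 = 0.1                                      # assumed noise variance

# Structured solve: eigendecompose each small factor instead of the full N x N matrix.
w1, Q1 = np.linalg.eigh(K1)
w2, Q2 = np.linalg.eigh(K2)
T = Q1.T @ Y @ Q2                                 # apply kron(Q1, Q2)^T to the grid-shaped data
T = T / (np.outer(w1, w2) + sigma2)               # divide by the eigenvalues of K + sigma2 * I
alpha = (Q1 @ T @ Q2.T).reshape(-1)               # alpha = (K + sigma2 * I)^{-1} vec(Y)

# Sanity check against the naive dense solve, which forms and factors the full N x N matrix.
K_full = np.kron(K1, K2)
alpha_dense = np.linalg.solve(K_full + sigma2 * np.eye(K_full.shape[0]), Y.reshape(-1))
print("max difference vs dense solve:", np.max(np.abs(alpha - alpha_dense)))
```

The two eigendecompositions cost on the order of n1^3 + n2^3 operations rather than N^3 with N = n1 * n2, which is the kind of structural saving the abstract refers to.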


Network Information
Related Topics (5)
Discretization: 53K papers, 1M citations, 89% related
Iterative method: 48.8K papers, 1.2M citations, 83% related
Numerical analysis: 52.2K papers, 1.2M citations, 83% related
Partial differential equation: 70.8K papers, 1.6M citations, 82% related
Differential equation: 88K papers, 2M citations, 78% related
Performance Metrics
No. of papers in the topic in previous years:
2023: 14
2022: 42
2021: 57
2020: 40
2019: 60
2018: 72