Enabling High-Dimensional Hierarchical Uncertainty Quantification by ANOVA and Tensor-Train Decomposition
TLDR
This paper develops an efficient analysis-of-variance (ANOVA) based stochastic circuit/microelectromechanical-systems simulator to extract surrogate models at the low level, and employs tensor-train decomposition at the high level to construct the basis functions and Gauss quadrature points.
Abstract
Hierarchical uncertainty quantification can reduce the computational cost of stochastic circuit simulation by employing spectral methods at different levels. This paper presents an efficient framework for hierarchically simulating challenging stochastic circuits/systems that contain high-dimensional subsystems. Because of the high parameter dimensionality, it is difficult both to extract surrogate models at the low level of the design hierarchy and to handle them in the high-level simulation. In this paper, we develop an analysis-of-variance (ANOVA) based stochastic circuit/microelectromechanical-systems (MEMS) simulator to efficiently extract the surrogate models at the low level. To avoid the curse of dimensionality, we employ tensor-train decomposition at the high level to construct the basis functions and Gauss quadrature points. As a demonstration, we verify our algorithm on a stochastic oscillator with four MEMS capacitors and 184 random parameters. This challenging example is simulated by our framework in only about 10 minutes in MATLAB on a regular personal computer.
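The low-level ANOVA step rests on decomposing a high-dimensional function into a constant plus low-order component functions, so each surrogate needs only a small number of univariate (or low-order) quadrature evaluations instead of a full tensor grid. The sketch below is a minimal first-order cut-HDMR/ANOVA illustration in Python/NumPy, not the paper's simulator; the toy model `f`, the anchor point at the nominal (zero) value, and the Gauss-Hermite rule for Gaussian parameters are assumptions made for demonstration.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss  # probabilists' Gauss-Hermite rule

def f(x):
    """Toy black-box model with d independent standard-normal inputs (assumed example)."""
    return np.exp(0.1 * np.sum(x)) + 0.5 * x[0] * x[1]

d = 8                      # parameter dimension
c = np.zeros(d)            # anchor (nominal) point for cut-HDMR
nodes, weights = hermegauss(9)
weights = weights / weights.sum()   # normalize so the rule integrates against the N(0,1) density

# First-order cut-HDMR: f(x) ~= f(c) + sum_i [ f(c with x_i replaced) - f(c) ]
f0 = f(c)
means = np.zeros(d)
variances = np.zeros(d)
for i in range(d):
    vals = np.empty(len(nodes))
    for k, xk in enumerate(nodes):
        x = c.copy()
        x[i] = xk
        vals[k] = f(x) - f0          # univariate component function g_i(x_i)
    means[i] = weights @ vals
    variances[i] = weights @ (vals - means[i]) ** 2

print("approximate mean:", f0 + means.sum())
print("per-dimension variance contributions:", variances)
print("total first-order variance:", variances.sum())
```

The per-dimension variance contributions are what an ANOVA-based simulator uses to decide which parameters (or parameter interactions) deserve a richer surrogate and which can be truncated.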
Citations
Calculation of Gauss quadrature rules.
Gene H. Golub, John H. Welsch
TL;DR: Two algorithms are given for generating the Gaussian quadrature rule defined by a weight function $\omega(t)$ when (a) the three-term recurrence relation is known for the orthogonal polynomials generated by $\omega(t)$, and (b) the moments of the weight function are known or can be calculated.
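In practice, the Golub-Welsch construction assembles the recurrence coefficients into a symmetric tridiagonal Jacobi matrix: its eigenvalues are the quadrature nodes, and the squared first components of its eigenvectors (scaled by the zeroth moment of the weight) are the weights. Below is a minimal Python/NumPy sketch for the Gauss-Hermite rule with the standard normal weight; the choice of the probabilists' Hermite recurrence and the sanity checks are illustrative assumptions, not code from the cited paper.

```python
import numpy as np

def gauss_hermite_golub_welsch(n):
    """Gauss quadrature for the standard normal weight via the Golub-Welsch algorithm.

    Probabilists' Hermite polynomials satisfy He_{k+1}(x) = x*He_k(x) - k*He_{k-1}(x),
    so the Jacobi matrix has a zero diagonal and off-diagonal entries sqrt(k).
    """
    k = np.arange(1, n)
    J = np.diag(np.sqrt(k), 1) + np.diag(np.sqrt(k), -1)   # symmetric tridiagonal Jacobi matrix
    nodes, vecs = np.linalg.eigh(J)                         # eigenvalues = quadrature nodes
    weights = vecs[0, :] ** 2                               # zeroth moment is 1 for the N(0,1) density
    return nodes, weights

nodes, weights = gauss_hermite_golub_welsch(5)
# Sanity check: a 5-point rule integrates N(0,1) moments exactly up to degree 9.
print(weights @ nodes**2)   # ~1.0 (variance of a standard normal)
print(weights @ nodes**4)   # ~3.0 (fourth moment)
```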
Posted Content
Tensorizing Neural Networks
TL;DR: This paper converts the dense weight matrices of the fully-connected layers to the Tensor Train format such that the number of parameters is reduced by a huge factor and at the same time the expressive power of the layer is preserved.
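The mechanism is easiest to see on a single weight matrix: reshape it into a higher-order tensor and compress that tensor with TT-SVD, so storage scales with the TT ranks rather than with the full matrix size. The following Python/NumPy sketch applies a generic TT-SVD with a rank cap to a random 256x256 matrix reshaped into an order-8 tensor; the reshape sizes, rank cap, and random matrix are illustrative assumptions, not the configuration used in the cited paper.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way tensor into TT cores via successive truncated SVDs (TT-SVD)."""
    cores, rank = [], 1
    remainder = tensor
    for mode_size in tensor.shape[:-1]:
        mat = remainder.reshape(rank * mode_size, -1)
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        new_rank = min(max_rank, len(S))
        cores.append(U[:, :new_rank].reshape(rank, mode_size, new_rank))
        remainder = np.diag(S[:new_rank]) @ Vt[:new_rank]    # carry the rest to the next mode
        rank = new_rank
    cores.append(remainder.reshape(rank, tensor.shape[-1], 1))
    return cores

# Treat a 256 x 256 dense weight matrix as an order-8 tensor with mode sizes 4.
W = np.random.default_rng(0).standard_normal((256, 256))
cores = tt_svd(W.reshape([4] * 8), max_rank=8)
print("dense parameters:", W.size)
print("TT parameters:   ", sum(c.size for c in cores))
```

With a rank cap of 8 the TT cores hold on the order of a thousand numbers versus 65,536 dense entries, at the price of an approximation error controlled by the discarded singular values.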
Journal Article
Low-Rank Tensor Networks for Dimensionality Reduction and Large-Scale Optimization Problems: Perspectives and Challenges PART 1.
TL;DR: In this paper, the authors provide mathematical and graphical representations and interpretation of tensor networks, with the main focus on the Tucker and Tensor Train (TT) decompositions and their extensions or generalizations.
References
Journal Article
Tensor Decompositions and Applications
Tamara G. Kolda, Brett W. Bader
TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Book
Stochastic Finite Elements: A Spectral Approach
Roger Ghanem, Pol D. Spanos
TL;DR: In this book, stochastic processes are given a spectral representation and response statistics are computed with the stochastic finite element method; numerical examples are provided throughout.
Journal Article
Analysis of individual differences in multidimensional scaling via an n-way generalization of 'Eckart-Young' decomposition
J. Douglas Carroll, Jih-Jie Chang
TL;DR: In this paper, an individual differences model for multidimensional scaling is outlined in which individuals are assumed differentially to weight the several dimensions of a common "psychological space" and a corresponding method of analyzing similarities data is proposed, involving a generalization of Eckart-Young analysis to decomposition of three-way (or higher-way) tables.
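The "n-way generalization of Eckart-Young" described here is what is now called the CANDECOMP/PARAFAC (CP) decomposition, commonly fitted by alternating least squares. Below is a bare-bones CP-ALS sketch for a 3-way array in Python/NumPy; the fixed iteration count, random initialization, and synthetic rank-2 test tensor are illustrative assumptions.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker (Khatri-Rao) product of two factor matrices."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cp_als(X, rank, n_iter=200):
    """Rank-R CANDECOMP/PARAFAC of a 3-way tensor by alternating least squares."""
    I, J, K = X.shape
    rng = np.random.default_rng(0)
    A, B, C = (rng.standard_normal((n, rank)) for n in (I, J, K))
    X1 = X.reshape(I, J * K)                      # mode-1 unfolding
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)   # mode-2 unfolding
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)   # mode-3 unfolding
    for _ in range(n_iter):
        A = X1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Usage: recover a synthetic rank-2 tensor (illustrative only).
rng = np.random.default_rng(1)
A0, B0, C0 = rng.standard_normal((5, 2)), rng.standard_normal((6, 2)), rng.standard_normal((7, 2))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(X, rank=2)
print(np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', A, B, C)) / np.linalg.norm(X))
```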
Journal Article
The Wiener--Askey Polynomial Chaos for Stochastic Differential Equations
TL;DR: This work represents stochastic processes with an optimal trial basis from the Askey family of orthogonal polynomials, which reduces the dimensionality of the system and leads to exponential convergence of the error.
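Concretely, for a single Gaussian input the Wiener-Askey (Hermite) chaos expands the output in probabilists' Hermite polynomials, and the mean and variance follow directly from the expansion coefficients. The sketch below performs that projection with Gauss-Hermite quadrature in Python/NumPy and compares against Monte Carlo; the test function, expansion order, and quadrature size are assumptions chosen for illustration.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

def hermite_chaos(f, order=6, n_quad=12):
    """Project f(xi), xi ~ N(0,1), onto probabilists' Hermite polynomials by Gauss quadrature."""
    x, w = hermegauss(n_quad)
    w = w / w.sum()                    # normalize to the standard normal density
    fx = f(x)
    coeffs = np.zeros(order + 1)
    for n in range(order + 1):
        e = np.zeros(n + 1)
        e[n] = 1.0                     # coefficient vector selecting He_n
        coeffs[n] = (w * fx * hermeval(x, e)).sum() / factorial(n)   # E[He_n^2] = n!
    return coeffs

# Example: a smooth nonlinear map of one Gaussian parameter (illustrative assumption).
f = lambda xi: np.exp(0.3 * xi) * np.cos(xi)
c = hermite_chaos(f)
mean = c[0]
var = sum(c[n] ** 2 * factorial(n) for n in range(1, len(c)))
print("gPC mean/std:", mean, np.sqrt(var))

xi = np.random.default_rng(0).standard_normal(200_000)
print("MC  mean/std:", f(xi).mean(), f(xi).std())
```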
Journal Article
A Multilinear Singular Value Decomposition
TL;DR: There is a strong analogy between several properties of the matrix SVD and the higher-order tensor decomposition; uniqueness, the link with the matrix eigenvalue decomposition, first-order perturbation effects, etc., are analyzed.
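For reference, the multilinear (higher-order) SVD takes one orthonormal factor matrix per mode from that mode's unfolding and forms the core tensor by multiplying the tensor with all of those factors, in direct analogy with U^T A V for matrices. A minimal real-valued Python/NumPy sketch follows; the random test tensor is an assumption.

```python
import numpy as np

def hosvd(X):
    """Higher-order SVD: orthonormal factor per mode, core via multilinear products."""
    factors = []
    core = X
    for mode in range(X.ndim):
        unfolding = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)   # mode-n unfolding
        U = np.linalg.svd(unfolding, full_matrices=False)[0]             # left singular vectors
        factors.append(U)
        # Contract the current leading axis with U; the new axis moves to the end,
        # so after all modes the core's axes are back in the original order.
        core = np.tensordot(core, U, axes=([0], [0]))
    return core, factors

X = np.random.default_rng(0).standard_normal((4, 5, 6))
core, (U1, U2, U3) = hosvd(X)
X_rec = np.einsum('abc,ia,jb,kc->ijk', core, U1, U2, U3)
print(np.allclose(X, X_rec))   # True: exact multilinear reconstruction at full rank
```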