Topic

High-dimensional model representation

About: High-dimensional model representation is a research topic. Over the lifetime, 289 publications have been published within this topic, receiving 6,901 citations.


Papers
Journal ArticleDOI
TL;DR: In this article, a family of multivariate representations is introduced to capture the input-output relationships of high-dimensional physical systems with many input variables, and a systematic mapping procedure between the inputs and outputs is prescribed to reveal the hierarchy of correlations amongst the input variables.
Abstract: A family of multivariate representations is introduced to capture the input–output relationships of high-dimensional physical systems with many input variables. A systematic mapping procedure between the inputs and outputs is prescribed to reveal the hierarchy of correlations amongst the input variables. It is argued that for most well-defined physical systems, only relatively low-order correlations of the input variables are expected to have an impact upon the output. The high-dimensional model representations (HDMR) utilize this property to present an exact hierarchical representation of the physical system. At each new level of HDMR, higher-order correlated effects of the input variables are introduced. Tests on several systems indicate that the few lowest-order terms are often sufficient to represent the model in equivalent form to good accuracy. The input variables may be either finite-dimensional (i.e., a vector of parameters chosen from the Euclidean space $$\mathcal{R}^n$$) or infinite-dimensional, as in the function space $$C^n[0,1]$$. Each hierarchical level of HDMR is obtained by applying a suitable projection operator to the output function, and each of these levels is orthogonal to the others with respect to an appropriately defined inner product. A family of HDMRs may be generated, each with distinct character, through different choices of projection operators. Two types of HDMR are illustrated in the paper: ANOVA-HDMR, which is the same as the analysis of variance (ANOVA) decomposition used in statistics, and cut-HDMR, which will be shown to be computationally more efficient than the ANOVA decomposition. Application of the HDMR tools can dramatically reduce the computational effort needed in representing the input–output relationships of a physical system. In addition, the hierarchy of identified correlation functions can provide valuable insight into the model structure. The notion of a model in the paper also encompasses input–output relationships developed with laboratory experiments, and the HDMR concepts are equally applicable in this domain. HDMRs can be classified as non-regressive, non-parametric learning networks. Selected applications of the HDMR concept are presented along with a discussion of its general utility.

826 citations
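
For reference, the hierarchical expansion described in the abstract above can be written out explicitly. This is the standard form of the decomposition; the component definitions below follow the ANOVA-HDMR and cut-HDMR variants named there, with the notation (inputs rescaled to the unit cube, reference point $$\bar{x}$$) chosen here for illustration rather than taken from the paper:

$$ f(x_1,\ldots,x_n) \;=\; f_0 \;+\; \sum_{i=1}^{n} f_i(x_i) \;+\; \sum_{1\le i<j\le n} f_{ij}(x_i,x_j) \;+\;\cdots\;+\; f_{12\ldots n}(x_1,\ldots,x_n) $$

ANOVA-HDMR defines the components by averaging over the remaining inputs,

$$ f_0 = \int_{[0,1]^n} f(x)\,dx, \qquad f_i(x_i) = \int_{[0,1]^{n-1}} f(x)\,dx_{\sim i} \;-\; f_0, \;\ldots $$

while cut-HDMR evaluates the model only along lines and planes through the reference point $$\bar{x}$$,

$$ f_0 = f(\bar{x}), \qquad f_i(x_i) = f\big(x_i,\bar{x}^{\sim i}\big) - f_0, \;\ldots $$

with higher-order terms defined recursively so that each level captures only what the levels below it leave unexplained.
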

Journal ArticleDOI
TL;DR: This paper considers an emerging family of high dimensional model representation concepts and techniques capable of dealing with large numbers of input variables and their typically nonlinear input → output relationships.
Abstract: In the chemical sciences, many laboratory experiments, environmental and industrial processes, as well as modeling exercises, are characterized by large numbers of input variables. A general objective in such cases is an exploration of the high-dimensional input variable space as thoroughly as possible for its impact on observable system behavior, often with either optimization in mind or simply for achieving a better understanding of the phenomena involved. An important concern when undertaking these explorations is the number of experiments or modeling excursions necessary to effectively learn the system input → output behavior, which is typically a nonlinear relationship. Although simple logic suggests that the number of runs could grow exponentially with the number of input variables, broadscale evidence indicates that the required effort often scales far more comfortably. This paper considers an emerging family of high dimensional model representation concepts and techniques capable of dealing with s...

483 citations
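
A back-of-the-envelope count (the numbers here are chosen purely for illustration, not taken from the paper) shows why truncating the representation at low order tames the apparent exponential growth. With $$n = 10$$ inputs sampled at $$s = 10$$ levels each, a full factorial exploration would need

$$ s^{n} = 10^{10} \ \text{runs}, $$

whereas a second-order HDMR involves only

$$ 1 + n + \binom{n}{2} = 1 + 10 + 45 = 56 $$

component functions, each depending on at most two variables and hence learnable from one- and two-dimensional sampling.
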

Journal ArticleDOI
TL;DR: Mathematical models described by multivariable functions f(x), where x = (x1,…,xn), are investigated, and an attempt can be made to construct a low-order approximation to the model using values of f(x) only.

415 citations
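
The "values of f(x) only" remark corresponds to the cut-HDMR construction, which tabulates the model along one-dimensional cuts through a reference point. Below is a minimal first-order sketch, assuming a model f that accepts a NumPy vector; the function name, reference point, and grids are illustrative choices, not anything specified in the paper.

    import numpy as np

    def cut_hdmr_first_order(f, x_ref, grids):
        """First-order cut-HDMR surrogate built from values of f alone.

        f      -- callable taking a 1-D NumPy array of length n
        x_ref  -- reference ("cut") point, shape (n,)
        grids  -- list of increasing 1-D arrays; grids[i] holds sample values for input i
        Approximates f(x) ~= f0 + sum_i [ f(x_i, x_ref with coordinate i varied) - f0 ].
        """
        n = len(x_ref)
        f0 = f(np.asarray(x_ref, dtype=float))  # zeroth-order term: model at the cut point
        tables = []
        for i in range(n):
            vals = []
            for xi in grids[i]:
                x = np.array(x_ref, dtype=float)
                x[i] = xi                        # vary only coordinate i along its cut
                vals.append(f(x) - f0)           # tabulated first-order component f_i
            tables.append(np.asarray(vals))

        def surrogate(x):
            # interpolate each tabulated component and sum the truncated hierarchy
            return f0 + sum(np.interp(x[i], grids[i], tables[i]) for i in range(n))

        return surrogate

Building this surrogate costs one model evaluation at the cut point plus one per grid node per input, so the number of evaluations grows linearly rather than exponentially in n.
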

Journal ArticleDOI
TL;DR: It is shown in an example that, within this quantitative model assessment and analysis tool, judicious use of orthonormal polynomials can provide a sampling saving regardless of the dimension of the input variable space.
Abstract: A general set of quantitative model assessment and analysis tools, termed high-dimensional model representations (HDMR), has been introduced recently for improving the efficiency of deducing high-dimensional input−output system behavior. HDMR is a particular family of representations where each term in the representation reflects the independent and cooperative contributions of the inputs upon the output. When data are randomly sampled, a RS (random sampling)-HDMR can be constructed. To reduce the sampling effort, different analytical basis functions, such as orthonormal polynomials, cubic B splines, and polynomials may be employed to approximate the RS-HDMR component functions. Only one set of random input−output samples is necessary to determine all the RS-HDMR component functions, and a few hundred samples may give a satisfactory approximation, regardless of the dimension of the input variable space. It is shown in an example that judicious use of orthonormal polynomials can provide a sampling saving o...

273 citations
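
A sketch of the construction described in this abstract: the first-order RS-HDMR components are fitted from a single set of random input-output samples by projecting onto orthonormal polynomials. The shifted Legendre basis, the assumption of inputs uniformly sampled on the unit cube, and the function name are illustrative choices; the paper also considers cubic B splines as basis functions.

    import numpy as np
    from numpy.polynomial.legendre import legvander

    def rs_hdmr_first_order(X, y, degree=3):
        """Monte Carlo RS-HDMR: first-order components in an orthonormal polynomial basis.

        X : (N, n) random input samples, assumed i.i.d. uniform on the unit cube [0, 1]^n
        y : (N,) corresponding model outputs
        Returns f0 and coefficient vectors alpha[i] such that
            f_i(x_i) ~= sum_k alpha[i][k] * phi_k(x_i),
        where phi_k(x) = sqrt(2k + 1) * P_k(2x - 1) are Legendre polynomials made
        orthonormal on [0, 1]. One random sample set serves every component.
        """
        N, n = X.shape
        f0 = y.mean()                     # zeroth-order term: Monte Carlo mean of the output
        centered = y - f0
        scale = np.sqrt(2.0 * np.arange(1, degree + 1) + 1.0)   # orthonormalization factors
        alphas = []
        for i in range(n):
            t = 2.0 * X[:, i] - 1.0                     # shift [0, 1] -> [-1, 1]
            Phi = legvander(t, degree)[:, 1:] * scale   # columns phi_1 .. phi_degree
            # orthonormality turns the projection integral into a simple sample average
            alphas.append(Phi.T @ centered / N)
        return f0, alphas
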

Journal ArticleDOI
TL;DR: It is argued that the number of samples needed for representation to a given tolerance is invariant to the dimensionality of the function, thereby providing for a very efficient means to perform high dimensional interpolation.
Abstract: Physical models of various phenomena are often represented by a mathematical model where the output(s) of interest have a multivariate dependence on the inputs. Frequently, the underlying laws governing this dependence are not known and one has to interpolate the mathematical model from a finite number of output samples. Multivariate approximation is normally viewed as suffering from the curse of dimensionality, as the number of sample points needed to learn the function to a sufficient accuracy increases exponentially with the dimensionality of the function. However, the outputs of most physical systems are mathematically well behaved, and the scarcity of the data is usually compensated for by additional assumptions on the function (i.e., imposition of smoothness conditions or confinement to a specific function space). High dimensional model representations (HDMR) are a particular family of representations where each term in the representation reflects the individual or cooperative contributions of the inputs upon the output. The main assumption of this paper is that, for most well defined physical systems, the output can be approximated by the sum of these hierarchical functions whose dimensionality is much smaller than the dimensionality of the output. This ansatz can dramatically reduce the sampling effort in representing the multivariate function. HDMR has a variety of applications where an efficient representation of multivariate functions arises with scarce data. The formulation of HDMR in this paper assumes that the data is randomly scattered throughout the domain of the output. Under these conditions and the assumptions underlying the HDMR, it is argued that the number of samples needed for representation to a given tolerance is invariant to the dimensionality of the function, thereby providing for a very efficient means to perform high dimensional interpolation. Selected applications of HDMRs are presented from sensitivity analysis and time-series analysis.

258 citations
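
Since sensitivity analysis is listed among the applications, it is worth noting that, with an orthonormal basis, first-order variance-based (Sobol-type) indices drop out of fitted HDMR components directly. A minimal sketch, assuming coefficient vectors like the alphas returned by the hypothetical rs_hdmr_first_order helper above; this is a common post-processing step, not a procedure taken from the paper.

    import numpy as np

    def first_order_sensitivities(alphas, y):
        """Variance-based (Sobol-type) indices from first-order HDMR coefficients.

        With an orthonormal basis, Var[f_i] is just the sum of squared coefficients,
        so S_i = Var[f_i] / Var[y] needs no further integration or sampling.
        """
        total_var = y.var()
        return [float(np.sum(a ** 2)) / total_var for a in alphas]
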


Network Information
Related Topics (5)
Optimization problem: 96.4K papers, 2.1M citations, 75% related
Finite element method: 178.6K papers, 3M citations, 73% related
Partial differential equation: 70.8K papers, 1.6M citations, 73% related
Boundary value problem: 145.3K papers, 2.7M citations, 71% related
Reynolds number: 68.4K papers, 1.6M citations, 71% related
Performance
Metrics: number of papers in this topic in previous years
Year: Papers
2022: 2
2021: 10
2020: 14
2019: 17
2018: 9
2017: 24