Journal ArticleDOI

Various approaches for constructing an ensemble of metamodels using local measures

01 Dec 2010-Structural and Multidisciplinary Optimization (Springer-Verlag)-Vol. 42, Iss: 6, pp 879-896
TL;DR: This paper investigates the efficiency of using local error measures, and presents the pointwise cross-validation error as a local error measure, as an alternative to prediction variance.
Abstract: Metamodels are approximate mathematical models used as surrogates for computationally expensive simulations. Since metamodels are widely used in design space exploration and optimization, there is growing interest in techniques that enhance their accuracy. It has been shown that the accuracy of metamodel predictions can be increased by combining individual metamodels in the form of an ensemble. Several efforts have focused on determining the contribution (or weight factor) of each metamodel in the ensemble using global error measures; prediction variance has also been used as a local error measure to determine the weight factors. This paper investigates the efficiency of using local error measures, and presents the pointwise cross-validation error as a local error measure, as an alternative to prediction variance. The effectiveness of the ensemble models is tested on several problems of varying dimensionality: five mathematical benchmark problems, two structural mechanics problems, and an automobile crash problem. The spatial ensemble models outperform the global ensemble on the low-dimensional problems, while the global ensemble is more accurate than the spatial ensembles on the high-dimensional problems. Ensembles based on pointwise cross-validation error and on prediction variance provide similar accuracy. The ensemble models based on local measures reduce cross-validation errors drastically, but they are far less effective at reducing the error evaluated at random test points, because the pointwise cross-validation error is not a good surrogate for the true error at a point.
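To make the weighting scheme concrete, below is a minimal sketch of a weighted-average ensemble whose weight factors come from pointwise (leave-one-out) cross-validation errors. The inverse-error weighting, the nearest-training-point lookup, and the three component surrogates are illustrative assumptions, not the paper's exact formulation (Python, using NumPy and scikit-learn):

```python
import numpy as np
from sklearn.base import clone
from sklearn.model_selection import LeaveOneOut
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

def pointwise_loo_errors(model, X, y):
    """Absolute leave-one-out cross-validation error at each training point."""
    errs = np.empty(len(X))
    for train, test in LeaveOneOut().split(X):
        m = clone(model).fit(X[train], y[train])
        errs[test] = np.abs(m.predict(X[test]) - y[test])
    return errs

def spatial_ensemble_predict(models, X, y, x_new, eps=1e-6):
    """Weight each surrogate at a new point by the inverse of its LOO error at
    the nearest training sample (one simple local measure; the paper compares
    several). A global ensemble would instead use one weight per surrogate."""
    errs = np.array([pointwise_loo_errors(m, X, y) for m in models])
    fitted = [clone(m).fit(X, y) for m in models]
    preds = np.array([m.predict(x_new) for m in fitted])
    nearest = np.argmin(((x_new[:, None, :] - X[None, :, :]) ** 2).sum(-1), axis=1)
    w = 1.0 / (errs[:, nearest] + eps)         # local inverse-error weights
    w /= w.sum(axis=0, keepdims=True)          # weights sum to one at each point
    return (w * preds).sum(axis=0)

# Toy usage on a 1-D test function
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (20, 1))
y = np.sin(X).ravel() + 0.1 * X.ravel() ** 2
models = [GaussianProcessRegressor(),
          SVR(kernel="rbf"),
          make_pipeline(PolynomialFeatures(2), LinearRegression())]
x_new = np.linspace(-3, 3, 50)[:, None]
print(spatial_ensemble_predict(models, X, y, x_new)[:5])
```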
Citations
Journal ArticleDOI
TL;DR: This article provides a comprehensive review of the important studies on design optimization for structural crashworthiness and energy absorption, and offers conclusions and recommendations to make academia and industry more aware of the available capabilities and recent developments in design optimization.
Abstract: Optimization for structural crashworthiness and energy absorption has become an important research topic owing to its proven benefits to public safety and the economy. This paper provides a comprehensive review of the important studies on design optimization for structural crashworthiness and energy absorption. First, the design criteria used in crashworthiness and energy absorption are reviewed, and the surrogate modeling used to evaluate these criteria is discussed. Second, multiobjective optimization, optimization under uncertainty, and topology optimization are reviewed, from concepts and algorithms to applications in crashworthiness. Third, crashworthy structures are summarized, from generically novel structural configurations to industrial applications. Finally, conclusions and recommendations are provided to make academia and industry more aware of the available capabilities and recent developments in design optimization for structural crashworthiness and energy absorption.

295 citations

Journal ArticleDOI
TL;DR: The Adaptive Hybrid Functions (AHF) formulates a reliable Crowding Distance-Based Trust Region (CD-TR), and adaptively combines the favorable characteristics of different surrogate models to capture the global trend of the function as well as the local deviations.
Abstract: The determination of complex underlying relationships between system parameters from simulated and/or recorded data requires advanced interpolating functions, also known as surrogates. The development of surrogates for such complex relationships often requires modeling high-dimensional and non-smooth functions using limited information. To this end, the hybrid surrogate modeling paradigm, where different surrogate models are combined, offers an effective solution. In this paper, we develop a new high-fidelity surrogate modeling technique that we call the Adaptive Hybrid Functions (AHF). The AHF formulates a reliable Crowding Distance-Based Trust Region (CD-TR) and adaptively combines the favorable characteristics of different surrogate models. The weight of each contributing surrogate model is determined by the local measure of accuracy of that surrogate model in the pertinent trust region. This approach is intended to exploit the advantages of each component surrogate, simultaneously capturing the global trend of the function and the local deviations. In this paper, the AHF combines four component surrogate models: (i) the Quadratic Response Surface Model (QRSM), (ii) Radial Basis Functions (RBF), (iii) Extended Radial Basis Functions (E-RBF), and (iv) the Kriging model. The AHF is applied to standard test problems and to a complex engineering design problem. Subsequent evaluations of the Root Mean Squared Error (RMSE) and the Maximum Absolute Error (MAE) illustrate the promising potential of this hybrid surrogate modeling approach.
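As an illustration of weighting by local accuracy, the sketch below assigns each surrogate a weight proportional to its inverse RMSE over validation points inside a fixed-radius ball around the prediction point. The ball is a crude, hypothetical stand-in for the crowding-distance-based trust region, whose actual construction in the paper is more involved:

```python
import numpy as np

def local_weights(errors, centers, x, radius, eps=1e-6):
    """Weight each surrogate by its inverse RMSE over validation points inside
    a ball around x (a simple stand-in for the AHF's CD-TR)."""
    mask = np.linalg.norm(centers - x, axis=1) <= radius
    if not mask.any():                       # empty region: fall back to global RMSE
        mask = np.ones(len(centers), dtype=bool)
    rmse = np.sqrt((errors[:, mask] ** 2).mean(axis=1))
    w = 1.0 / (rmse + eps)
    return w / w.sum()

# errors: residuals of, e.g., QRSM, RBF, E-RBF, and Kriging at validation points
errors = np.abs(np.random.default_rng(1).normal(size=(4, 30)))
centers = np.random.default_rng(2).uniform(0, 1, (30, 2))
print(local_weights(errors, centers, x=np.array([0.5, 0.5]), radius=0.25))
```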

116 citations


Cites background from "Various approaches for constructing..."

  • ...Acar (2010) investigated the efficiency of using various local error measures for constructing an ensemble of surrogate models, and also presented the use of the pointwise cross-validation error as a local error measure....

    [...]

  • ...More recently, researchers have presented the development of a combination of different approximate models into a single hybrid model for developing weighted average surrogates (Zerpa et al. 2005; Goel et al. 2007; Sanchez et al. 2008; Acar and Rais-Rohani 2009; Viana et al. 2009; Acar 2010)....

    [...]

Journal ArticleDOI
TL;DR: A multi-fidelity surrogate model based on radial basis functions (MFS-RBF) is developed in this paper by combining HF and LF models; it presents better accuracy and robustness than three benchmark MFS models and single-fidelity surrogates.
Abstract: In computational simulation, a high-fidelity (HF) model is generally more accurate than a low-fidelity (LF) model, while the latter is generally more computationally efficient than the former. To take advantage of both HF and LF models, a multi-fidelity surrogate model based on radial basis functions (MFS-RBF) is developed in this paper by combining HF and LF models. To determine the scaling factor between HF and LF models, a correlation matrix is augmented by further integrating LF responses. The scaling factor and the relevant basis-function weights are then calculated from the corresponding HF responses. MFS-RBF is compared with the Co-Kriging model, the multi-fidelity surrogate based on linear regression (LR-MFS), the CoRBF model, and three single-fidelity surrogates. The impact of key factors, such as the cost ratio of LF to HF models and different combinations of HF and LF samples, is also investigated. The results show that (i) MFS-RBF presents better accuracy and robustness than the three benchmark MFS models and the single-fidelity surrogates in about 90% of the cases in this paper; (ii) MFS-RBF is less sensitive to the correlation between HF and LF models than the three MFS models; (iii) for a fixed total computational cost, the cost ratio of LF to HF models is suggested to be less than 0.2, and 10–80% of the total cost should be spent on LF samples; (iv) the MFS-RBF model saves an average of 50 to 70% of the computational cost when the HF and LF models are highly correlated.
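The paper's MFS-RBF determines its scaling factor through an augmented correlation matrix; as a hedged stand-in, the sketch below uses the simpler scaling-plus-discrepancy scheme (closer in spirit to the LR-MFS baseline), in which a least-squares scaling factor maps the LF model onto the HF data and an RBF interpolator models the residual:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def fit_mf_surrogate(X_hf, y_hf, lf_model):
    """y_HF(x) ~ rho * lf_model(x) + delta(x), with delta an RBF fit to the
    residuals (not the paper's exact MFS-RBF formulation)."""
    y_lf = lf_model(X_hf)
    rho = (y_lf @ y_hf) / (y_lf @ y_lf)               # least-squares scaling factor
    delta = RBFInterpolator(X_hf, y_hf - rho * y_lf)  # discrepancy model
    return lambda X: rho * lf_model(X) + delta(X)

# Toy example: the cheap LF model is a biased, scaled version of the HF function
hf = lambda X: (6 * X[:, 0] - 2) ** 2 * np.sin(12 * X[:, 0] - 4)
lf = lambda X: 0.5 * hf(X) + 10 * (X[:, 0] - 0.5) - 5
X_hf = np.linspace(0, 1, 8)[:, None]
mf = fit_mf_surrogate(X_hf, hf(X_hf), lf)
X_test = np.linspace(0, 1, 5)[:, None]
print(np.c_[hf(X_test), mf(X_test)])                  # HF truth vs. MF prediction
```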

80 citations


Cites background from "Various approaches for constructing..."

  • ...(…2005; Acar and Rais-Rohani 2009; Acar 2010; Liu et al. 2016)....

    [...]

Journal ArticleDOI
TL;DR: The one-shot VF metamodeling process is transformed into an iterative process that utilizes the already-acquired information about the difference characteristics between the HF and LF models, and several numerical cases and a long-cylinder pressure vessel design optimization problem verify the applicability of the proposed VF metamodeling approach.

70 citations

Journal ArticleDOI
TL;DR: The results indicated that the proposed ensemble surrogate model is an effective method for solving inverse contaminant source identification problems with high accuracy and short computation time.

65 citations

References
Book
23 Nov 2005
TL;DR: The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
Abstract: Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed from both Bayesian and classical perspectives. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines, and others. Theoretical issues, including learning curves and the PAC-Bayesian framework, are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises; code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes.

11,357 citations

Book
29 Aug 1995
TL;DR: Using a practical approach, this book discusses two-level factorial and fractional factorial designs, several aspects of empirical modeling with regression techniques, focusing on response surface methodology, mixture experiments and robust design techniques.
Abstract: From the Publisher: Using a practical approach, the book discusses two-level factorial and fractional factorial designs and several aspects of empirical modeling with regression techniques, focusing on response surface methodology, mixture experiments, and robust design techniques. It features numerous authentic application examples and problems, illustrates how computers can be a useful aid in problem solving, and includes a disk containing computer programs for a response surface methodology simulation exercise and for mixture experiments.

10,104 citations

Journal ArticleDOI
TL;DR: This paper presents a metamodeling framework for computer experiments: modeling deterministic code output as the realization of a stochastic process, predicting output from training data, and choosing criteria-based designs for efficient prediction.
Abstract: Many scientific phenomena are now investigated by complex computer models or codes. A computer experiment is a number of runs of the code with various inputs. A feature of many computer experiments is that the output is deterministic: rerunning the code with the same inputs gives identical observations. Often, the codes are computationally expensive to run, and a common objective of an experiment is to fit a cheaper predictor of the output to the data. Our approach is to model the deterministic output as the realization of a stochastic process, thereby providing a statistical basis for designing experiments (choosing the inputs) for efficient prediction. With this model, estimates of uncertainty of predictions are also available. Recent work in this area is reviewed, a number of applications are discussed, and we demonstrate our methodology with an example.
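The sketch below shows the core of this stochastic-process (kriging) predictor under simplifying assumptions: a zero-mean GP with a fixed Gaussian kernel, rather than hyperparameters estimated by maximum likelihood. The returned prediction variance is the same local error measure that the main article compares against the pointwise cross-validation error:

```python
import numpy as np

def gp_predict(X, y, X_new, length=0.5, jitter=1e-10):
    """Zero-mean GP (kriging) predictive mean and variance for a deterministic
    code; jitter is added only for numerical stability."""
    k = lambda A, B: np.exp(-((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
                            / (2 * length ** 2))
    K = k(X, X) + jitter * np.eye(len(X))
    Ks, Kss = k(X_new, X), k(X_new, X_new)
    alpha = np.linalg.solve(K, y)
    mean = Ks @ alpha                                    # interpolates the data
    var = np.diag(Kss - Ks @ np.linalg.solve(K, Ks.T))   # -> 0 at sampled inputs
    return mean, var

X = np.array([[0.0], [0.3], [0.7], [1.0]])
y = np.sin(2 * np.pi * X).ravel()
mean, var = gp_predict(X, y, np.linspace(0, 1, 5)[:, None])
print(mean, var)
```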

6,583 citations

Book
01 Jan 1978

5,151 citations

Journal ArticleDOI
TL;DR: The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.

2,152 citations


"Various approaches for constructing..." refers background in this paper

  • ...An extensive review of metamodeling can be found in Queipo et al. (2005) and Wang and Shan (2007)....

    [...]