Journal ArticleDOI

Review of Metamodeling Techniques in Support of Engineering Design Optimization

01 Jan 2006-Vol. 129, Iss: 4, pp 370-380
TL;DR: This work reviews the state-of-the-art metamodel-based techniques from a practitioner's perspective according to the role of metamodeling in supporting design optimization, including model approximation, design space exploration, problem formulation, and solving various types of optimization problems.
Abstract: Computation-intensive design problems are becoming increasingly common in manufacturing industries. The computation burden is often caused by expensive analysis and simulation processes in order to reach a level of accuracy comparable to physical testing data. To address such a challenge, approximation or metamodeling techniques are often used. Metamodeling techniques have been developed from many different disciplines including statistics, mathematics, computer science, and various engineering disciplines. These metamodels are initially developed as “surrogates” of the expensive simulation process in order to improve the overall computation efficiency. They are then found to be a valuable tool to support a wide scope of activities in modern engineering design, especially design optimization. This work reviews the state-of-the-art metamodel-based techniques from a practitioner’s perspective according to the role of metamodeling in supporting design optimization, including model approximation, design space exploration, problem formulation, and solving various types of optimization problems. Challenges and future developments of metamodeling in support of engineering design are also analyzed and discussed. Copyright © 2006 by ASME
Citations
Journal ArticleDOI
TL;DR: This paper compares Maximum Likelihood Estimation (MLE) and Cross-Validation (CV) parameter estimation methods for selecting a kriging model’s parameters given its form, and an R² of prediction and the corrected Akaike Information Criterion for assessing the quality of the created kriging model, permitting the comparison of different forms of a kriging model.
Abstract: The use of kriging models for approximation and metamodel-based design and optimization has been steadily on the rise in the past decade. The widespread usage of kriging models appears to be hampered by the lack of (1) guidance in selecting the appropriate form of the kriging model, (2) computationally efficient algorithms for estimating the model’s parameters, and (3) an effective method to assess the resulting model’s quality. In this paper, we compare (1) Maximum Likelihood Estimation (MLE) and Cross-Validation (CV) parameter estimation methods for selecting a kriging model’s parameters given its form and (2) an R² of prediction and the corrected Akaike Information Criterion for assessing the quality of the created kriging model, permitting the comparison of different forms of a kriging model. These methods are demonstrated with six test problems. Finally, different forms of kriging models are examined to determine if more complex forms are more accurate and easier to fit than simple forms of kriging models for approximating computer models.
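The MLE-versus-CV comparison can be illustrated with a minimal sketch. The code below is a toy example under stated assumptions (one input dimension, a Gaussian correlation function, an ordinary-kriging constant mean, a small nugget for numerical stability, and a simple grid search over the correlation parameter theta); it is not the authors' implementation.

```python
import numpy as np

# Toy training data from an "expensive" function, here y = sin(3x) (an assumption for illustration).
X = np.linspace(0.0, 1.0, 8)
y = np.sin(3.0 * X)

def corr_matrix(X, theta):
    """Gaussian correlation R_ij = exp(-theta * (x_i - x_j)^2)."""
    d = X[:, None] - X[None, :]
    return np.exp(-theta * d**2)

def neg_log_likelihood(theta, X, y, nugget=1e-8):
    """Negative concentrated log-likelihood of ordinary kriging (constant mean)."""
    n = len(y)
    R = corr_matrix(X, theta) + nugget * np.eye(n)
    Rinv = np.linalg.inv(R)
    ones = np.ones(n)
    mu = (ones @ Rinv @ y) / (ones @ Rinv @ ones)   # generalized least-squares mean
    r = y - mu
    sigma2 = (r @ Rinv @ r) / n                     # process variance estimate
    _, logdet = np.linalg.slogdet(R)
    return 0.5 * (n * np.log(sigma2) + logdet)

def loo_cv_error(theta, X, y, nugget=1e-8):
    """Leave-one-out cross-validation mean squared error of the kriging predictor."""
    n = len(y)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        Xi, yi = X[mask], y[mask]
        R = corr_matrix(Xi, theta) + nugget * np.eye(n - 1)
        Rinv = np.linalg.inv(R)
        ones = np.ones(n - 1)
        mu = (ones @ Rinv @ yi) / (ones @ Rinv @ ones)
        r = np.exp(-theta * (X[i] - Xi)**2)          # correlations to the left-out point
        pred = mu + r @ Rinv @ (yi - mu)
        errs.append((pred - y[i])**2)
    return np.mean(errs)

thetas = np.logspace(0, 2, 40)
theta_mle = thetas[np.argmin([neg_log_likelihood(t, X, y) for t in thetas])]
theta_cv = thetas[np.argmin([loo_cv_error(t, X, y) for t in thetas])]
print(f"theta (MLE) = {theta_mle:.3g}, theta (CV) = {theta_cv:.3g}")
```

The two criteria generally select different correlation parameters; the paper's question is which choice, combined with a quality measure such as R² of prediction or AICc, yields the better metamodel.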

833 citations

Journal ArticleDOI
TL;DR: Two broad families of surrogates, namely response surface surrogates, which are statistical or empirical data-driven models emulating the high-fidelity model responses, and lower-fidelity physically based surrogates, which are simplified models of the original system, are detailed in this paper.
Abstract: Surrogate modeling, also called metamodeling, has evolved and been extensively used over the past decades. A wide variety of methods and tools have been introduced for surrogate modeling aiming to develop and utilize computationally more efficient surrogates of high-fidelity models mostly in optimization frameworks. This paper reviews, analyzes, and categorizes research efforts on surrogate modeling and applications with an emphasis on the research accomplished in the water resources field. The review analyzes 48 references on surrogate modeling arising from water resources and also screens out more than 100 references from the broader research community. Two broad families of surrogates namely response surface surrogates, which are statistical or empirical data-driven models emulating the high-fidelity model responses, and lower-fidelity physically based surrogates, which are simplified models of the original system, are detailed in this paper. Taxonomies on surrogate modeling frameworks, practical details, advances, challenges, and limitations are outlined. Important observations and some guidance for surrogate modeling decisions are provided along with a list of important future research directions that would benefit the common sampling and search (optimization) analyses found in water resources.

663 citations

Journal ArticleDOI
TL;DR: This paper provides a comprehensive review of bilevel optimization from basic principles to solution strategies, both classical and evolutionary; a number of potential application problems are discussed, and an automated text-analysis of an extended list of papers on bilevel optimization has been performed.
Abstract: Bilevel optimization is defined as a mathematical program, where an optimization problem contains another optimization problem as a constraint. These problems have received significant attention from the mathematical programming community. Only limited work exists on bilevel problems using evolutionary computation techniques; however, recently there has been an increasing interest due to the proliferation of practical applications and the potential of evolutionary algorithms in tackling these problems. This paper provides a comprehensive review on bilevel optimization from the basic principles to solution strategies; both classical and evolutionary. A number of potential application problems are also discussed. To offer the readers insights on the prominent developments in the field of bilevel optimization, we have performed an automated text-analysis of an extended list of papers published on bilevel optimization to date. This paper should motivate evolutionary computation researchers to pay more attention to this practical yet challenging area.
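The nested structure described above can be made concrete with a small sketch. The following is my own toy example, not taken from the reviewed paper: for each upper-level candidate x, the lower-level problem is solved to optimality for y before the upper-level objective is evaluated, here by brute-force grid search.

```python
import numpy as np

def upper_objective(x, y):
    # Leader's objective F(x, y) (toy example)
    return (x - 1.0)**2 + y**2

def lower_objective(x, y):
    # Follower's objective f(x, y); its minimizer for a given x is y = x
    return (y - x)**2

xs = np.linspace(-2.0, 2.0, 401)   # upper-level candidates
ys = np.linspace(-2.0, 2.0, 401)   # lower-level candidates

best = (np.inf, None, None)
for x in xs:
    # Solve the lower-level problem for this x (here by brute-force search).
    y_opt = ys[np.argmin(lower_objective(x, ys))]
    F = upper_objective(x, y_opt)
    if F < best[0]:
        best = (F, x, y_opt)

print(f"F* = {best[0]:.3f} at x = {best[1]:.3f}, y = {best[2]:.3f}")
# Analytically, y(x) = x, so F(x) = (x-1)^2 + x^2 is minimized at x = 0.5 with F* = 0.5.
```

The nested loop illustrates why bilevel problems are expensive: every upper-level evaluation requires an inner optimization, which is exactly where classical reformulations and evolutionary strategies reviewed in the paper come in.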

588 citations

Journal ArticleDOI
TL;DR: The goal of this paper is to review both the understanding of the field and the support tools that exist for the purpose, and to identify trends and possible directions in which research may evolve in the future.
Abstract: Product design is a highly involved, often ill-defined, complex and iterative process, and the needs and specifications of the required artifact get more refined only as the design process moves toward its goal. An effective computer support tool that helps the designer make better-informed decisions requires efficient knowledge representation schemes. In today's world, there is a virtual explosion in the amount of raw data available to the designer, and knowledge representation is critical in order to sift through this data and make sense of it. In addition, the need to stay competitive has shrunk product development time through the use of simultaneous and collaborative design processes, which depend on effective transfer of knowledge between teams. Finally, the awareness that decisions made early in the design process have a higher impact in terms of energy, cost, and sustainability has resulted in the need to project knowledge typically required in the later stages of design to the earlier stages. Research in design rationale systems, product families, systems engineering, and ontology engineering has sought to capture knowledge from earlier product design decisions, from the breakdown of product functions and associated physical features, and from customer requirements and feedback reports. Virtual reality (VR) systems and multidisciplinary modeling have enabled the simulation of scenarios in the manufacture, assembly, and use of the product. This has helped capture vital knowledge from these stages of the product life and use it in design validation and testing. While there have been considerable and significant developments in knowledge capture and representation in product design, it is useful to sometimes review our position in the area, study the evolution of research in product design, and, from past and current trends, try to foresee future developments. The goal of this paper is thus to review both our understanding of the field and the support tools that exist for the purpose, and identify the trends and possible directions research can evolve in the future.

583 citations

Journal ArticleDOI
TL;DR: A survey of modeling and optimization strategies that may help solve High-dimensional, Expensive (computationally), Black-box (HEB) problems; two promising approaches for solving HEB problems are identified.
Abstract: The integration of optimization methodologies with computational analyses/simulations has a profound impact on product design. Such integration, however, faces multiple challenges. The most eminent challenges arise from high-dimensionality of problems, computationally-expensive analysis/simulation, and unknown function properties (i.e., black-box functions). The merger of these three challenges severely aggravates the difficulty and becomes a major hurdle for design optimization. This paper provides a survey on related modeling and optimization strategies that may help to solve High-dimensional, Expensive (computationally), Black-box (HEB) problems. The survey screens out 207 references including multiple historical reviews on relevant subjects from more than 1,000 papers in a variety of disciplines. This survey has been performed in three areas: strategies tackling high-dimensionality of problems, model approximation techniques, and direct optimization strategies for computationally-expensive black-box functions and promising ideas behind non-gradient optimization algorithms. Major contributions in each area are discussed and presented in an organized manner. The survey exposes that direct modeling and optimization strategies to address HEB problems are scarce and sporadic, partially due to the difficulty of the problem itself. Moreover, it is revealed that current modeling research tends to focus on sampling and modeling techniques themselves and neglects studying and taking advantage of the characteristics of the underlying expensive functions. Based on the survey results, two promising approaches are identified to solve HEB problems. Directions for future research are also discussed.

535 citations

References
Book
29 Aug 1995
TL;DR: Using a practical approach, this book discusses two-level factorial and fractional factorial designs and several aspects of empirical modeling with regression techniques, focusing on response surface methodology, mixture experiments, and robust design techniques.
Abstract: From the Publisher: Using a practical approach, it discusses two-level factorial and fractional factorial designs, several aspects of empirical modeling with regression techniques, focusing on response surface methodology, mixture experiments and robust design techniques. Features numerous authentic application examples and problems. Illustrates how computers can be a useful aid in problem solving. Includes a disk containing computer programs for a response surface methodology simulation exercise and one concerning mixtures.
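A brief sketch of the workflow these designs support, using a hypothetical two-factor example of my own (not taken from the book): build a two-level full factorial design in coded units, augment it with a center point, evaluate a response, and fit a first-order response surface with interaction by least squares.

```python
import numpy as np
from itertools import product

# Two-level full factorial design in two coded factors (-1, +1), plus a center point.
design = np.array(list(product([-1.0, 1.0], repeat=2)) + [(0.0, 0.0)])

def response(x1, x2):
    """Hypothetical noisy process response (stands in for a physical experiment)."""
    rng = np.random.default_rng(0)
    return 50.0 + 4.0 * x1 - 2.5 * x2 + 1.5 * x1 * x2 + rng.normal(0.0, 0.5, np.shape(x1))

y = response(design[:, 0], design[:, 1])

# First-order response surface with interaction: y ≈ b0 + b1*x1 + b2*x2 + b12*x1*x2
X = np.column_stack([np.ones(len(design)), design[:, 0], design[:, 1],
                     design[:, 0] * design[:, 1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients [b0, b1, b2, b12]:", np.round(coef, 2))
```

Fractional factorial, central composite, Box-Behnken, and Plackett-Burman designs differ only in which rows the design matrix contains; the fitting step stays the same.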

10,104 citations


"Review of Metamodeling Techniques i..." refers background or methods in this paper

  • ...This approach is commonly seen in literature [1, 8, 114]....


  • ...Widely used “classic” experimental designs include factorial or fractional factorial [8], central composite design (CCD) [8, 9], Box-Behnken [8], alphabetical optimal [10, 11], and Plackett-Burman designs [8]....


Journal ArticleDOI
TL;DR: In this paper, two sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies, and they are shown to be improvements over simple random sampling with respect to variance for a class of estimators which includes the sample mean and the empirical distribution function.
Abstract: Two types of sampling plans are examined as alternatives to simple random sampling in Monte Carlo studies. These plans are shown to be improvements over simple random sampling with respect to variance for a class of estimators which includes the sample mean and the empirical distribution function.
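A minimal sketch of the variance-reduction effect the paper establishes, using my own illustration (in one dimension a Latin hypercube reduces to stratified sampling, and the integrand g is an arbitrary choice): estimate E[g(U)] for U ~ Uniform(0, 1) repeatedly with both plans and compare the empirical variance of the sample mean.

```python
import numpy as np

rng = np.random.default_rng(42)
g = lambda u: np.exp(u)          # arbitrary smooth integrand; E[g(U)] = e - 1
n, reps = 20, 2000               # sample size per estimate, number of repetitions

def simple_random_sample(n):
    return rng.uniform(0.0, 1.0, n)

def latin_hypercube_sample(n):
    # One stratum per sample: a uniform draw inside each of the n equal subintervals,
    # then a random permutation (the permutation only matters in higher dimensions).
    strata = (np.arange(n) + rng.uniform(0.0, 1.0, n)) / n
    return rng.permutation(strata)

srs_means = [g(simple_random_sample(n)).mean() for _ in range(reps)]
lhs_means = [g(latin_hypercube_sample(n)).mean() for _ in range(reps)]

print(f"true mean        : {np.e - 1:.5f}")
print(f"SRS  var of mean : {np.var(srs_means):.2e}")
print(f"LHS  var of mean : {np.var(lhs_means):.2e}   # typically much smaller")
```

This variance reduction is why Latin hypercube designs became a default space-filling choice for building metamodels of expensive simulations.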

8,328 citations

Journal ArticleDOI
TL;DR: This paper introduces the reader to a response surface methodology that is especially good at modeling the nonlinear, multimodal functions that often occur in engineering and shows how these approximating functions can be used to construct an efficient global optimization algorithm with a credible stopping rule.
Abstract: In many engineering optimization problems, the number of function evaluations is severely limited by time or cost. These problems pose a special challenge to the field of global optimization, since existing methods often require more function evaluations than can be comfortably afforded. One way to address this challenge is to fit response surfaces to data collected by evaluating the objective and constraint functions at a few points. These surfaces can then be used for visualization, tradeoff analysis, and optimization. In this paper, we introduce the reader to a response surface methodology that is especially good at modeling the nonlinear, multimodal functions that often occur in engineering. We then show how these approximating functions can be used to construct an efficient global optimization algorithm with a credible stopping rule. The key to using response surfaces for global optimization lies in balancing the need to exploit the approximating surface (by sampling where it is minimized) with the need to improve the approximation (by sampling where prediction error may be high). Striking this balance requires solving certain auxiliary problems which have previously been considered intractable, but we show how these computational obstacles can be overcome.
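The exploitation/exploration balance described above is usually struck with an infill criterion such as expected improvement. Below is a minimal, self-contained sketch of that standard criterion (not code from the paper); the predictor mean mu(x) and standard error s(x) are assumed to come from a kriging model fitted elsewhere, and the candidate values are made up for illustration.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, s, f_best):
    """Expected improvement for minimization at points with kriging prediction `mu`
    and standard error `s`, given the best observed objective value `f_best`."""
    mu, s = np.asarray(mu, float), np.asarray(s, float)
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (f_best - mu) / s
        ei = (f_best - mu) * norm.cdf(z) + s * norm.pdf(z)
    return np.where(s > 0.0, ei, 0.0)   # EI is zero where the model is certain

# Toy usage: candidate points with made-up predictions and uncertainties.
mu = np.array([0.2, 0.0, -0.1, 0.3])
s = np.array([0.0, 0.05, 0.20, 0.40])
print(expected_improvement(mu, s, f_best=0.1))
# The next sample is taken where EI is largest: low predicted values (exploitation)
# compete with high prediction uncertainty (exploration), the balance EGO relies on.
```

The auxiliary problem the paper refers to is maximizing this criterion over the design space, which is cheap because it only queries the surrogate.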

6,914 citations


"Review of Metamodeling Techniques i..." refers methods in this paper

  • ...[119, 123], where the authors applied the Bayesian method to estimate...


Journal ArticleDOI
TL;DR: In this article, a new method is presented for flexible regression modeling of high dimensional data, which takes the form of an expansion in product spline basis functions, where the number of basis functions as well as the parameters associated with each one (product degree and knot locations) are automatically determined by the data.
Abstract: A new method is presented for flexible regression modeling of high dimensional data. The model takes the form of an expansion in product spline basis functions, where the number of basis functions as well as the parameters associated with each one (product degree and knot locations) are automatically determined by the data. This procedure is motivated by the recursive partitioning approach to regression and shares its attractive properties. Unlike recursive partitioning, however, this method produces continuous models with continuous derivatives. It has more power and flexibility to model relationships that are nearly additive or involve interactions in at most a few variables. In addition, the model can be represented in a form that separately identifies the additive contributions and those associated with the different multivariable interactions.
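The "expansion in product spline basis functions" can be made concrete with truncated-linear (hinge) basis functions. The sketch below is a simplified illustration of mine: it builds hinge features with a hand-picked knot and fits their coefficients by least squares, whereas MARS itself chooses the number of basis functions, product degrees, and knot locations adaptively by forward selection and backward pruning.

```python
import numpy as np

def hinge(x, knot, sign=+1):
    """Truncated linear basis max(0, sign*(x - knot)), the building block of MARS."""
    return np.maximum(0.0, sign * (x - knot))

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 10.0, 200)
y = np.where(x < 4.0, 2.0 * x, 8.0 + 0.5 * (x - 4.0)) + rng.normal(0.0, 0.3, x.size)

# Fixed knot at 4.0 for illustration; MARS would search over candidate knots instead.
B = np.column_stack([np.ones_like(x), hinge(x, 4.0, +1), hinge(x, 4.0, -1)])
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
print("coefficients for [1, (x-4)_+, (4-x)_+]:", np.round(coef, 2))
# Products of such hinges in different variables give the multivariable interaction terms.
```

Because each basis function is piecewise linear, the fitted model is continuous with continuous first derivatives once the hinges are combined, which is the property the abstract contrasts with recursive partitioning.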

6,651 citations

Journal ArticleDOI
TL;DR: This paper models the deterministic output of a computer experiment as the realization of a stochastic process, providing a statistical basis for designing the experiments and for predicting output at untried inputs, with accompanying uncertainty estimates.
Abstract: Many scientific phenomena are now investigated by complex computer models or codes. A computer experiment is a number of runs of the code with various inputs. A feature of many computer experiments is that the output is deterministic--rerunning the code with the same inputs gives identical observations. Often, the codes are computationally expensive to run, and a common objective of an experiment is to fit a cheaper predictor of the output to the data. Our approach is to model the deterministic output as the realization of a stochastic process, thereby providing a statistical basis for designing experiments (choosing the inputs) for efficient prediction. With this model, estimates of uncertainty of predictions are also available. Recent work in this area is reviewed, a number of applications are discussed, and we demonstrate our methodology with an example.
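A minimal sketch of this idea, using a toy one-dimensional example of my own with a Gaussian correlation function and a fixed correlation parameter (not the authors' code): the stochastic-process model interpolates the deterministic runs exactly and supplies a standard error that vanishes at the design points and grows away from them.

```python
import numpy as np

theta = 10.0                                   # assumed, fixed correlation parameter
X = np.linspace(0.0, 1.0, 6)                   # design points (inputs of the code runs)
y = np.cos(5.0 * X)                            # deterministic "code" output (toy stand-in)

def corr(a, b):
    return np.exp(-theta * (a[:, None] - b[None, :])**2)

R = corr(X, X) + 1e-10 * np.eye(len(X))
Rinv = np.linalg.inv(R)
ones = np.ones(len(X))
mu = (ones @ Rinv @ y) / (ones @ Rinv @ ones)           # constant-mean (ordinary kriging) estimate
sigma2 = ((y - mu) @ Rinv @ (y - mu)) / len(X)          # process variance estimate

def predict(x_new):
    r = corr(np.atleast_1d(x_new), X)                    # correlations with the design points
    mean = mu + r @ Rinv @ (y - mu)
    var = sigma2 * (1.0 - np.einsum("ij,jk,ik->i", r, Rinv, r))  # simple kriging variance
    return mean, np.sqrt(np.maximum(var, 0.0))

m, s = predict(np.array([0.0, 0.1, 0.5, 0.95]))
print("prediction :", np.round(m, 3))
print("std. error :", np.round(s, 3))   # ~0 at the design point 0.0, larger between points
```

The design question the paper addresses is where to place the runs in X so that this predictor, and its uncertainty, are as useful as possible for the whole input space.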

6,583 citations