
Showing papers on "Surrogate model published in 1997"


Journal ArticleDOI
TL;DR: A Bayesian-validated surrogate framework is presented that permits economical and reliable integration of large-scale simulations into engineering design and optimization; several illustrative applications in heat transfer and fluid mechanics are discussed.

21 citations


Dissertation
01 Jan 1997
TL;DR: A systematic computer-aided optimal design decision process is developed which allows the designer to rapidly evaluate and improve a proposed design by taking into account the major factors of interest related to different aspects such as design, construction, and operation.
Abstract: A general framework for multi-criteria optimal design is presented which is well-suited for automated design of structural systems. A systematic computer-aided optimal design decision process is developed which allows the designer to rapidly evaluate and improve a proposed design by taking into account the major factors of interest related to different aspects such as design, construction, and operation. The proposed optimal design process requires the selection of the most promising choice of design parameters taken from a large design space, based on an evaluation using specified criteria. The design parameters specify a particular design, and so they relate to member sizes, structural configuration, etc. The evaluation of the design uses performance parameters which may include structural response parameters, risks due to uncertain loads and modeling errors, construction and operating costs, etc. Preference functions are used to implement the design criteria in a "soft" form. These preference functions give a measure of the degree of satisfaction of each design criterion. The overall evaluation measure for a design is built up from the individual measures for each criterion through a preference combination rule. The goal of the optimal design process is to obtain a design that has the highest overall evaluation measure, which is an optimization problem. Genetic algorithms are stochastic optimization methods that are based on evolutionary theory. They provide the exploration power needed to search high-dimensional design spaces for these optimal solutions. Two special genetic algorithms, hGA and vGA, are presented here for continuous and discrete optimization problems, respectively. The methodology is demonstrated with several examples involving the design of truss and frame systems. These examples are solved by using the proposed hGA and vGA.
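The preference-function machinery described above can be illustrated with a small sketch. The two-variable member design (cross-section area and length), the linear-ramp preference shape, the geometric-mean combination rule, and the toy responses below are assumptions made only for the example; the dissertation's hGA and vGA operate differently. The sketch just shows how individual criterion satisfactions combine into one overall measure that a simple genetic algorithm then maximizes.

```python
# Minimal sketch: preference functions combined into an overall evaluation
# measure, maximized by a very simple real-coded genetic algorithm.
# All numeric values and response formulas are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def preference(value, ideal, acceptable):
    """Degree of satisfaction in [0, 1]: 1 at or below `ideal`,
    falling linearly to 0 at `acceptable`."""
    return float(np.clip((acceptable - value) / (acceptable - ideal), 0.0, 1.0))

def overall_measure(design):
    """Combine individual criterion measures (geometric mean as one
    possible preference combination rule)."""
    area, length = design                              # hypothetical design parameters
    weight = 7850.0 * area * length                    # toy steel-member mass [kg]
    deflection = 1e-4 * length ** 3 / max(area, 1e-9)  # toy stiffness response [m]
    prefs = [
        preference(weight, ideal=50.0, acceptable=500.0),      # cost criterion
        preference(deflection, ideal=0.005, acceptable=0.05),  # serviceability criterion
    ]
    return float(np.prod(prefs) ** (1.0 / len(prefs)))

def genetic_search(pop_size=40, generations=60,
                   bounds=((1e-3, 1e-2), (1.0, 5.0))):
    """Truncation selection plus Gaussian mutation over the design space."""
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(generations):
        fitness = np.array([overall_measure(d) for d in pop])
        parents = pop[np.argsort(fitness)[-pop_size // 2:]]
        children = parents[rng.integers(0, len(parents), pop_size - len(parents))]
        children = np.clip(children + rng.normal(0.0, 0.05, children.shape) * (hi - lo), lo, hi)
        pop = np.vstack([parents, children])
    best = max(pop, key=overall_measure)
    return best, overall_measure(best)

best_design, best_score = genetic_search()
print("best (area, length):", best_design, "overall measure:", round(best_score, 3))
```

The geometric-mean combination used here is deliberately strict: a design that badly violates any single criterion scores near zero overall, which is one common way to encode "soft" preferences that still cannot be ignored.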

14 citations


01 Jan 1997
TL;DR: Advances in three directions are overviewed: Concurrent Simulation, based on developments in the theory of discrete event systems, extracts from a single simulation information that would otherwise require multiple repeated simulations; Neural Network metamodels provide fast approximate surrogates of the actual system; and Hierarchical Simulation provides yet another means for speedup, with applications including combat simulation.
Abstract: Simulation of large complex systems for the purpose of evaluating performance and exploring alternatives is a computationally slow process, currently still out of the domain of real-time applications. This paper overviews advances in three directions aimed at overcoming this limitation. First, based on developments in the theory of discrete event systems, Concurrent Simulation enables the extraction of information from a single simulation that would otherwise require multiple repeated simulations. This effectively provides simulation speedups of possibly orders of magnitude. A second direction attempts to use simulation for the purpose of obtaining a metamodel of the actual system, i.e., an approximate surrogate model which is computationally very fast, yet accurate. We will specifically discuss the use of Neural Networks as metamodeling devices which may be trained through simulation. Finally, Hierarchical Simulation provides yet another means for speedup, a major challenge being the preservation of fidelity between hierarchical levels. In practice, using the statistical average of a high resolution level simulator output as the input for a lower resolution level causes significant loss of stochastic fidelity. We will present an approach in which we cluster the high resolution simulation output into path bundles as the input for the lower resolution level. The paper includes applications of these new directions to areas such as combat simulation and design of C3I systems.
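As a rough illustration of the metamodeling direction, the sketch below samples a toy single-server-queue simulator (a stand-in for any slow discrete-event model) and trains a small neural network on the resulting input/output pairs so it can act as a fast surrogate. The queueing example, the sampling scheme, and the use of scikit-learn's MLPRegressor are assumptions made for the example, not the paper's actual models or training procedure.

```python
# Minimal sketch of a neural-network metamodel trained through simulation.
# The "slow_simulator" below is a hypothetical stand-in for an expensive model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

def slow_simulator(arrival_rate, service_rate, n_customers=5000):
    """Toy discrete-event stand-in: average waiting time in a single-server queue."""
    inter_arrivals = rng.exponential(1.0 / arrival_rate, n_customers)
    services = rng.exponential(1.0 / service_rate, n_customers)
    wait, total = 0.0, 0.0
    for ia, s in zip(inter_arrivals, services):
        wait = max(0.0, wait + s - ia)   # Lindley-style waiting-time recursion
        total += wait
    return total / n_customers

# 1) Run the expensive simulator at a modest number of sampled input points.
X = rng.uniform([0.2, 1.0], [0.8, 2.0], size=(150, 2))   # (arrival rate, service rate)
y = np.array([slow_simulator(a, s) for a, s in X])

# 2) Train a small neural network on the simulation data to act as the metamodel.
metamodel = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0)
metamodel.fit(X, y)

# 3) The metamodel now answers "what if" queries far faster than re-simulating.
print("predicted mean wait at (0.5, 1.5):", metamodel.predict([[0.5, 1.5]])[0])
```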

5 citations


Journal ArticleDOI
TL;DR: This paper presents a polynomial induction network (PIN) methodology that supports a designer in problems where design variables are numerous and simulations are expensive.
Abstract: A key aspect of design considers the effects of large groups of design variables on multiple measures of system performance or responses. Thus, a goal of design-aiding systems is to learn the contributions of each design variable to the response variables. This problem is particularly difficult when simulation runs are expensive in either time or money because conducting exhaustive searches over the design space is not possible. This situation occurs quite frequently for large-scale design problems where the number of design variables is large and their relationships to the response variables are not well understood. In this paper, we present a polynomial induction network (PIN) methodology that will support a designer in problems of this sort, where design variables are numerous and simulations are expensive. Our description shows how this approach was implemented in software and actually used to design a multi-processor system. Based on this experience, we conclude that PINs have excellent promise.
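The gist of learning variable contributions from scarce simulation runs can be sketched with an ordinary polynomial fit: sample the expensive simulator a modest number of times, fit a low-order polynomial, and read the fitted terms to see which design variables drive the response. A PIN proper grows a network of such polynomials inductively; the single quadratic fit and the toy expensive_simulation function below are simplifying assumptions, not the paper's method.

```python
# Minimal sketch: a low-order polynomial surrogate fitted to a small budget of
# expensive simulation runs; coefficient magnitudes hint at variable importance.
# The simulator and variable names are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(2)

def expensive_simulation(x):
    """Stand-in for a costly simulator: the response depends strongly on x0 and x1,
    only weakly on x2 (plus noise), which the fitted polynomial should reveal."""
    return 4.0 * x[0] ** 2 + 2.0 * x[0] * x[1] + 0.1 * x[2] + rng.normal(0, 0.05)

# Only a small number of simulation runs is affordable.
X = rng.uniform(-1, 1, size=(40, 3))                 # three design variables
y = np.array([expensive_simulation(x) for x in X])

# Fit a quadratic polynomial model to the sampled runs.
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

# Coefficient magnitudes indicate each term's contribution to the response.
for name, coef in zip(poly.get_feature_names_out(["x0", "x1", "x2"]), model.coef_):
    print(f"{name:8s} {coef:+.3f}")
```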

2 citations