
Surrogate model

About: Surrogate model is a research topic. Over the lifetime, 5019 publications have been published within this topic receiving 77441 citations.


Papers
Journal Article (DOI)
TL;DR: In this article, a deep neural network (DNN) is used to enforce the initial and boundary conditions, and the governing partial differential equations (i.e., Navier-Stokes equations) are incorporated into the loss of the DNN to drive the training.

341 citations
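The loss construction described in the TL;DR above can be illustrated on a much simpler problem. The sketch below is a toy 1D ODE with a closed-form parametric model rather than the paper's Navier-Stokes setup or an actual neural network; the ODE, the model form, and the collocation grid are all illustrative assumptions.

```python
import numpy as np

# Physics-informed loss sketch (assumed toy problem, not the paper's
# Navier-Stokes setup): for the ODE u'(x) = u(x) with u(0) = 1, the
# training loss combines a boundary-condition term and a PDE-residual
# term at collocation points. The "network" here is a one-parameter
# model u(x; a) = exp(a * x) so the loss is easy to inspect.

def u(x, a):
    return np.exp(a * x)

def du_dx(x, a):
    return a * np.exp(a * x)

def physics_informed_loss(a, x_colloc):
    boundary = (u(0.0, a) - 1.0) ** 2                               # initial condition u(0) = 1
    residual = np.mean((du_dx(x_colloc, a) - u(x_colloc, a)) ** 2)  # governing-equation residual
    return boundary + residual

x = np.linspace(0.0, 1.0, 50)
# The loss vanishes at a = 1, the exact solution u(x) = exp(x).
print(physics_informed_loss(1.0, x))        # 0.0
print(physics_informed_loss(2.0, x) > 0.0)  # True
```

A real DNN surrogate would minimize the same kind of composite loss by gradient descent, with the residual obtained via automatic differentiation of the network output.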

Journal Article (DOI)
TL;DR: Deep neural networks (DNN) are used to construct surrogate models for numerical simulators in a manner that lends the DNN surrogate the interpretation of recovering a low-dimensional nonlinear manifold.

340 citations
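The manifold interpretation above can be made concrete with a linear stand-in. The sketch below uses SVD/PCA instead of a deep network (an assumed simplification; the synthetic data and dimensions are made up) to show how simulator outputs driven by a few latent inputs concentrate on a low-dimensional manifold inside a high-dimensional output space.

```python
import numpy as np

# Low-dimensional-manifold sketch (assumed linear stand-in: SVD rather
# than the paper's deep-network surrogate). Outputs that depend on only
# two latent simulator inputs lie on a 2-D subspace of a 50-D space,
# which a bottleneck (here, the numerical rank) recovers.

rng = np.random.default_rng(1)
t = rng.uniform(0.0, 1.0, (200, 2))   # 2 latent simulator inputs
basis = rng.normal(size=(2, 50))
Y = t @ basis                         # 50-D outputs on a 2-D subspace

Y_c = Y - Y.mean(axis=0)
_, s, _ = np.linalg.svd(Y_c, full_matrices=False)
# Only 2 singular values are numerically nonzero: the outputs live on a
# 2-dimensional manifold despite the 50-dimensional ambient space.
rank = int(np.sum(s > 1e-8 * s[0]))
print(rank)  # 2
```

A DNN surrogate generalizes this picture by letting the manifold be nonlinear: the hidden layers play the role of the singular vectors.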

Journal Article (DOI)
TL;DR: The results indicate that the CORS-RBF algorithms are competitive with existing global optimization algorithms for costly functions on the box-constrained test problems and are better than other algorithms for constrained global optimization on the nonlinearly constrained test problem.
Abstract: We present a new strategy for the constrained global optimization of expensive black box functions using response surface models. A response surface model is simply a multivariate approximation of a continuous black box function which is used as a surrogate model for optimization in situations where function evaluations are computationally expensive. Prior global optimization methods that utilize response surface models were limited to box-constrained problems, but the new method can easily incorporate general nonlinear constraints. In the proposed method, which we refer to as the Constrained Optimization using Response Surfaces (CORS) Method, the next point for costly function evaluation is chosen to be the one that minimizes the current response surface model subject to the given constraints and to additional constraints that the point be of some distance from previously evaluated points. The distance requirement is allowed to cycle, starting from a high value (global search) and ending with a low value (local search). The purpose of the constraint is to drive the method towards unexplored regions of the domain and to prevent the premature convergence of the method to some point which may not even be a local minimizer of the black box function. The new method can be shown to converge to the global minimizer of any continuous function on a compact set regardless of the response surface model that is used. Finally, we considered two particular implementations of the CORS method which utilize a radial basis function model (CORS-RBF) and applied them to the box-constrained Dixon–Szegő test functions and to a simple nonlinearly constrained test function. The results indicate that the CORS-RBF algorithms are competitive with existing global optimization algorithms for costly functions on the box-constrained test problems. The results also show that the CORS-RBF algorithms are better than other algorithms for constrained global optimization on the nonlinearly constrained test problem.

335 citations
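The point-selection rule described in the abstract above can be sketched in a few lines. The toy 1D implementation below is not the authors' code: the Gaussian RBF surrogate, the black-box function, the candidate grid, and the crude distance rule are all illustrative assumptions; only the cycling distance requirement (global search to local search) follows the description.

```python
import numpy as np

# Minimal 1D sketch of the CORS idea: fit an RBF surrogate to the
# evaluated points, then evaluate next the candidate that minimizes the
# surrogate among candidates at least beta * d_max away from every
# previously evaluated point. beta cycles from high (global search)
# down to low (local search).

def rbf_fit(X, y, gamma=1.0):
    # Gaussian RBF interpolation: solve K w = y.
    K = np.exp(-gamma * (X[:, None] - X[None, :]) ** 2)
    return np.linalg.solve(K, y)

def rbf_predict(Xq, X, w, gamma=1.0):
    K = np.exp(-gamma * (Xq[:, None] - X[None, :]) ** 2)
    return K @ w

def cors_next_point(X, y, candidates, beta):
    w = rbf_fit(X, y)
    s = rbf_predict(candidates, X, w)
    d = np.min(np.abs(candidates[:, None] - X[None, :]), axis=1)
    d_max = candidates.max() - candidates.min()   # crude domain diameter
    feasible = d >= beta * d_max                  # distance constraint
    if not feasible.any():                        # relax if too strict
        feasible = d == d.max()
    idx = np.flatnonzero(feasible)
    return candidates[idx[np.argmin(s[idx])]]

f = lambda x: (x - 0.3) ** 2                      # "expensive" black box (toy)
X = np.array([0.0, 0.5, 1.0])
y = f(X)
candidates = np.linspace(0.0, 1.0, 101)
for beta in [0.4, 0.2, 0.05, 0.0]:                # cycling: global -> local
    x_new = cors_next_point(X, y, candidates, beta)
    X = np.append(X, x_new)
    y = np.append(y, f(x_new))
print(len(X), y.min() < 0.04)                     # 7 True
```

Early iterations are forced far from evaluated points; the final beta = 0 step minimizes the surrogate freely, mirroring the global-to-local cycle in the paper.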

Journal Article
TL;DR: A generic space-mapping optimization algorithm is formulated, explained step-by-step using a simple microstrip filter example, and its robustness is demonstrated through the fast design of an interdigital filter.
Abstract: In this article we review state-of-the-art concepts of space mapping and place them contextually into the history of design optimization and modeling of microwave circuits. We formulate a generic space-mapping optimization algorithm, explain it step-by-step using a simple microstrip filter example, and then demonstrate its robustness through the fast design of an interdigital filter. Selected topics of space mapping are discussed, including implicit space mapping, gradient-based space mapping, the optimal choice of surrogate model, and tuning space mapping. We consider the application of space mapping to the modeling of microwave structures. We also discuss a software package for automated space-mapping optimization that involves both electromagnetic (EM) and circuit simulators.

327 citations
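The generic algorithm mentioned in the abstract above can be condensed into a toy loop. The sketch below shows aggressive space mapping in 1D; the fine and coarse response functions, the target value, and the identity-mapped update are all made-up assumptions (the paper's examples use EM-simulated filter responses, not closed-form formulas).

```python
import numpy as np

# Toy sketch of aggressive space mapping (assumed 1D responses).
# Design goal: make the expensive fine-model response hit a target.
# The cheap coarse model has a known optimum; parameter extraction
# aligns the two models, and the update shifts the fine design.

target = 1.0

def fine_response(x):        # "expensive" model; hits target at x = 2
    return np.exp(x - 2.0)

def coarse_response(z):      # cheap model; hits target at z = 0
    return np.exp(z)

z_star = 0.0                 # coarse-model optimum: coarse_response(0) = target

def extract(x):
    # Parameter extraction: the coarse input whose response matches the
    # fine response at x. With these monotone responses it is unique:
    # solves exp(z) = fine_response(x), i.e. z = x - 2.
    return np.log(fine_response(x))

x = 0.0
for _ in range(3):
    p = extract(x)           # space-mapped parameter
    x = x - (p - z_star)     # aggressive space-mapping update
print(round(x, 6))           # 2.0
```

Because the extracted mapping is a simple shift here, one update lands on the fine-model optimum; with realistic simulators the same loop iterates until the mapped parameter matches the coarse optimum.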

Journal Article (DOI)
TL;DR: Automated learning of algebraic models for optimization (ALAMO), the computational implementation of the proposed methodology, along with examples and extensive computational comparisons between ALAMO and a variety of machine learning techniques, including Latin hypercube sampling, simple least-squares regression, and the lasso are described.
Abstract: A central problem in modeling, namely that of learning an algebraic model from data obtained from simulations or experiments, is addressed. A methodology that uses a small number of simulations or experiments to learn models that are as accurate and as simple as possible is proposed. The approach begins by building a low-complexity surrogate model. The model is built using a best subset technique that leverages an integer programming formulation to allow for the efficient consideration of a large number of possible functional components in the model. The model is then improved systematically through the use of derivative-free optimization solvers to adaptively sample new simulation or experimental points. Automated learning of algebraic models for optimization (ALAMO), the computational implementation of the proposed methodology, is described, along with examples and extensive computational comparisons between ALAMO and a variety of machine learning techniques, including Latin hypercube sampling, simple least-squares regression, and the lasso. © 2014 American Institute of Chemical Engineers AIChE J, 60: 2211–2227, 2014

325 citations
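The best-subset stage described above can be sketched with exhaustive enumeration standing in for the paper's integer-programming formulation (an assumed simplification; the basis-function library and the synthetic data below are also made up for illustration).

```python
import itertools
import numpy as np

# Best-subset sketch of ALAMO's first stage (assumed simplification:
# brute-force enumeration instead of integer programming). From a
# library of candidate basis functions, pick the subset of at most k
# terms that minimizes the squared fitting error.

def best_subset(X_lib, y, names, k):
    best = (np.inf, None, None)
    for size in range(1, k + 1):
        for cols in itertools.combinations(range(X_lib.shape[1]), size):
            A = X_lib[:, list(cols)]
            coef = np.linalg.lstsq(A, y, rcond=None)[0]
            sse = np.sum((A @ coef - y) ** 2)
            if sse < best[0]:
                best = (sse, cols, coef)
    sse, cols, coef = best
    return [names[c] for c in cols], coef, sse

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 40)
y = 2.0 * x**2 - 0.5 * x                  # noiseless "simulation" data
library = np.column_stack([x, x**2, x**3, np.sin(x), np.exp(x)])
names = ["x", "x^2", "x^3", "sin(x)", "exp(x)"]
terms, coef, sse = best_subset(library, y, names, k=2)
print(terms)  # ['x', 'x^2']
```

The enumeration recovers the two true terms with coefficients -0.5 and 2.0; ALAMO's integer-programming formulation makes the same search tractable for much larger libraries, and its adaptive-sampling stage then refines the model.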


Network Information

Related Topics (5)
- Optimization problem: 96.4K papers, 2.1M citations (82% related)
- Finite element method: 178.6K papers, 3M citations (79% related)
- Robustness (computer science): 94.7K papers, 1.6M citations (79% related)
- Artificial neural network: 207K papers, 4.5M citations (78% related)
- Support vector machine: 73.6K papers, 1.7M citations (76% related)
Performance Metrics

No. of papers in the topic in previous years:

Year  Papers
2023  528
2022  981
2021  840
2020  729
2019  547
2018  458