SciSpace - formerly Typeset
Author

Jin Zhang

Bio: Jin Zhang is an academic researcher from Tilburg University. The author has contributed to research in the topics of design specification and the engineering design process. The author has an h-index of 1 and has co-authored 3 publications receiving 22 citations. Previous affiliations of Jin Zhang include the Centre national de la recherche scientifique.

Papers
Journal ArticleDOI
TL;DR: In this article, a methodology combining simulation, bootstrapping, and metamodeling is proposed to estimate which uncertain environmental parameters are important (so managers can become proactive) and which parameter combinations (scenarios) make the design unacceptable.
Abstract: Managers wish to verify that a particular engineering design meets their requirements. This design's future environment will differ from the environment assumed during the design. Therefore, it is crucial to determine which variations in the environment may make this design unacceptable. The proposed methodology estimates which uncertain environmental parameters are important (so managers can become proactive) and which parameter combinations (scenarios) make the design unacceptable. The methodology combines simulation, bootstrapping, and metamodeling. The methodology is illustrated through a simulated manufacturing system, including fourteen uncertain parameters of the input distributions for the various arrival and service times. These parameters are investigated through sixteen scenarios, selected through a two-level fractional-factorial design. The resulting simulation Input/Output (I/O) data are analyzed through a first-order polynomial metamodel and bootstrapping. A second experiment gives some outputs that are indeed unacceptable. Polynomials fitted to the I/O data estimate the boundary (frontier) between acceptable and unacceptable environments.

20 citations
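The simulation-bootstrap-metamodeling workflow described in the abstract above can be sketched in miniature. The code below is a hypothetical toy stand-in, not the paper's experiment: it uses three coded parameters and an invented noisy response instead of the fourteen-parameter manufacturing simulation, but follows the same steps of running a two-level design, fitting a first-order polynomial metamodel, and bootstrapping the replicates to quantify coefficient variability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the simulated manufacturing system: three
# uncertain environmental parameters (coded -1/+1) and a noisy output.
def simulate(x, n_rep=20):
    mean = 5.0 + 2.0 * x[0] - 1.0 * x[1] + 0.1 * x[2]
    return mean + rng.normal(0.0, 0.5, size=n_rep)

# Two-level full factorial design in 3 factors (8 scenarios); the paper
# instead selects 16 scenarios via a two-level fractional factorial.
levels = [-1.0, 1.0]
design = np.array([[a, b, c] for a in levels for b in levels for c in levels])

outputs = np.array([simulate(x) for x in design])   # shape (8, 20)
y_bar = outputs.mean(axis=1)

# First-order polynomial metamodel: y ~ b0 + b1*x1 + b2*x2 + b3*x3.
X = np.column_stack([np.ones(len(design)), design])
beta = np.linalg.lstsq(X, y_bar, rcond=None)[0]

# Bootstrap the replicates within each scenario to quantify the
# variability of the estimated metamodel coefficients.
boot = []
for _ in range(500):
    y_star = np.array([rng.choice(row, size=row.size, replace=True).mean()
                       for row in outputs])
    boot.append(np.linalg.lstsq(X, y_star, rcond=None)[0])
boot = np.array(boot)
ci = np.percentile(boot, [2.5, 97.5], axis=0)   # 95% CI per coefficient
```

A coefficient whose bootstrap interval excludes zero would be flagged as an important environmental parameter; in the paper the fitted polynomial is then used to trace the frontier between acceptable and unacceptable scenarios.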

Posted Content
TL;DR: An approach is suggested that determines which uncertain parameters are important and which combinations of these parameters can lead to an unacceptable design; it combines several methods, namely simulation, bootstrapping, and metamodeling.
Abstract: In the design of a manufacturing system, the design specification is often suggested by a design team. Managers are interested in verifying that this specification will satisfy the production requirements. Because the future production environment will likely differ from the one assumed, it is important to determine in which situations the suggested design becomes unacceptable. This paper suggests an approach that allows determining which uncertain parameters are important and which combinations of these parameters can lead to an unacceptable design. This approach combines several methods, namely, simulation, bootstrapping, and metamodeling. The methodology is explained and illustrated through a stochastic simulated manufacturing system, which includes uncertain parameters related to the arrival and the processing times of jobs. This example shows the conditions under which the system does not meet the requirements.

1 citation

Proceedings ArticleDOI
06 Jul 2009
TL;DR: In this paper, an approach is presented that determines which uncertain parameters are important and which combinations of these parameters can lead to an unacceptable design. The approach is illustrated through a stochastic simulated manufacturing system, which includes uncertain parameters related to the arrival and processing times of jobs.
Abstract: In the design of a manufacturing system, the design specification is often suggested by a design team. Managers are interested in verifying that this specification will satisfy the production requirements. Because the future production environment will likely differ from the one assumed, it is important to determine in which situations the suggested design becomes unacceptable. This paper suggests an approach that allows determining which uncertain parameters are important and which combinations of these parameters can lead to an unacceptable design. This approach combines several methods, namely, simulation, bootstrapping, and metamodeling. The methodology is explained and illustrated through a stochastic simulated manufacturing system, which includes uncertain parameters related to the arrival and the processing times of jobs. This example shows the conditions under which the system does not meet the requirements.

1 citation


Cited by
Journal ArticleDOI
TL;DR: This article reviews the design and analysis of simulation experiments and focuses on analysis via two types of metamodel, namely low-order polynomial regression and Kriging (or Gaussian process) models.
Abstract: This article reviews the design and analysis of simulation experiments. It focuses on analysis via either low-order polynomial regression or Kriging (also known as Gaussian process) metamodels. The type of metamodel determines the design of the experiment, which determines the input combinations of the simulation experiment. For example, a first-order polynomial metamodel requires a "resolution-III" design, whereas Kriging may use Latin hypercube sampling. Polynomials of first or second order require resolution III, IV, V, or "central composite" designs. Before applying either regression or Kriging, sequential bifurcation may be applied to screen a great many inputs. Optimization of the simulated system may use either a sequence of low-order polynomials known as response surface methodology (RSM) or Kriging models fitted through sequential designs including efficient global optimization (EGO). The review includes robust optimization, which accounts for uncertain simulation inputs.

205 citations
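The pairing of metamodel type and design type in the review above can be illustrated briefly. The sketch below, with invented sizes, builds a two-level factorial design (as used for first-order polynomials) and a hand-rolled Latin hypercube sample (as commonly paired with Kriging); for production use, `scipy.stats.qmc.LatinHypercube` provides the same construction.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two-level design: every factor at coded levels -1/+1
# (here the full 2^2 factorial in two factors).
factorial = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]])

def latin_hypercube(n, d, rng):
    """n points in [0, 1]^d with exactly one point per axis-aligned bin:
    each coordinate gets a random permutation of the n bins, plus a
    uniform offset inside the chosen bin."""
    perms = np.array([rng.permutation(n) for _ in range(d)]).T  # (n, d)
    return (perms + rng.random((n, d))) / n

lhs = latin_hypercube(10, 2, rng)
```

The defining property, and the reason LHS suits space-filling designs for Kriging, is that projecting the sample onto any single axis yields exactly one point in each of the n equal bins.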

Journal ArticleDOI
TL;DR: The proposed multi-objective robust simulation optimization approach can be used efficiently to develop policy suggestions and improve decision support for policymakers, and it can be applied in dynamically complex and deeply uncertain systems such as public health, financial systems, transportation, and housing.

99 citations

Journal ArticleDOI
TL;DR: This article uses conditional Monte Carlo to derive a consistent quantile sensitivity estimator that improves upon the convergence rates of earlier batched and kernel estimators and requires no batching or binning, and illustrates the new estimator using a simple but realistic portfolio credit risk example, for which the previous work is inapplicable.
Abstract: Estimating quantile sensitivities is important in many optimization applications, from hedging in financial engineering to service-level constraints in inventory control to more general chance constraints in stochastic programming. Recently, Hong (Hong, L. J. 2009. Estimating quantile sensitivities. Oper. Res. 57 118-130) derived a batched infinitesimal perturbation analysis estimator for quantile sensitivities, and Liu and Hong (Liu, G., L. J. Hong. 2009. Kernel estimation of quantile sensitivities. Naval Res. Logist. 56 511-525) derived a kernel estimator. Both of these estimators are consistent with convergence rates bounded by n^(-1/3) and n^(-2/5), respectively. In this paper, we use conditional Monte Carlo to derive a consistent quantile sensitivity estimator that improves upon these convergence rates and requires no batching or binning. We illustrate the new estimator using a simple but realistic portfolio credit risk example, for which the previous work is inapplicable.

78 citations
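To make the notion of a quantile sensitivity concrete, here is a deliberately simple finite-difference Monte Carlo sketch with common random numbers (not the paper's conditional Monte Carlo estimator, and with invented parameter values) on a scaled exponential loss whose sensitivity is known in closed form.

```python
import numpy as np

rng = np.random.default_rng(7)

# Loss L(theta) = theta * E with E ~ Exp(1), so the alpha-quantile is
# q(theta) = -theta * ln(1 - alpha) and dq/dtheta = -ln(1 - alpha).
alpha, theta, h, n = 0.95, 2.0, 1e-3, 100_000
E = rng.exponential(1.0, size=n)   # common random numbers for both runs

# Central finite difference of the empirical quantile in theta.
q_plus = np.quantile((theta + h) * E, alpha)
q_minus = np.quantile((theta - h) * E, alpha)
sensitivity = (q_plus - q_minus) / (2 * h)

exact = -np.log(1 - alpha)
```

Reusing the same draws `E` at theta + h and theta - h keeps the difference stable; the paper's contribution is an estimator that avoids such batched or smoothed constructions while achieving a better convergence rate.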

Posted Content
TL;DR: This work develops a "robust" methodology that accounts for uncertain environments, using Taguchi's view of the uncertain world but replacing his statistical techniques with design and analysis of simulation experiments based on Kriging (Gaussian process model); moreover, it uses bootstrapping to quantify the variability in the estimated Kriging metamodels.
Abstract: Optimization of simulated systems is the goal of many methods, but most methods assume known environments. We, however, develop a "robust" methodology that accounts for uncertain environments. Our methodology uses Taguchi's view of the uncertain world, but replaces his statistical techniques by Kriging. We illustrate the resulting methodology through classic Economic Order Quantity (EOQ) inventory models. Our results suggest that robust optimization requires order quantities that differ from the classic EOQ. We also compare our latest results with our previous results that do not use Kriging but Response Surface Methodology (RSM).

72 citations

Journal ArticleDOI
TL;DR: In this article, a robust methodology that accounts for uncertain environments is proposed, which uses Taguchi's view of the uncertain world but replaces his statistical techniques by design and analysis of simulation experiments based on Kriging (Gaussian process model); moreover, bootstrapping is used to quantify the variability in the estimated Kriging metamodels.
Abstract: Optimization of simulated systems is the goal of many methods, but most methods assume known environments. We, however, develop a “robust” methodology that accounts for uncertain environments. Our methodology uses Taguchi's view of the uncertain world but replaces his statistical techniques by design and analysis of simulation experiments based on Kriging (Gaussian process model); moreover, we use bootstrapping to quantify the variability in the estimated Kriging metamodels. In addition, we combine Kriging with nonlinear programming, and we estimate the Pareto frontier. We illustrate the resulting methodology through economic order quantity (EOQ) inventory models. Our results suggest that robust optimization requires order quantities that differ from the classic EOQ. We also compare our results with results we previously obtained using response surface methodology instead of Kriging.

71 citations
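For reference, the classic EOQ baseline that the robust order quantities above are compared against is a one-line formula. A minimal sketch with hypothetical parameter values (demand rate D, fixed ordering cost K, holding cost h per unit per time):

```python
import math

def eoq(D, K, h):
    """Classic economic order quantity: Q* = sqrt(2*K*D/h)."""
    return math.sqrt(2.0 * K * D / h)

def cost(Q, D, K, h):
    """Total cost rate: ordering cost K*D/Q plus holding cost h*Q/2."""
    return K * D / Q + h * Q / 2.0

D, K, h = 1000.0, 50.0, 2.0
q_star = eoq(D, K, h)

# Robust flavor (illustrative only): when demand D is uncertain, one can
# score each Q by its worst case over demand scenarios; the minimizer of
# that criterion generally differs from the classic EOQ.
scenarios = [800.0, 1000.0, 1200.0]
worst = lambda Q: max(cost(Q, d, K, h) for d in scenarios)
```

The papers above replace this closed-form model with simulated systems, fit Kriging metamodels to the simulated cost, and then optimize under the environmental uncertainty; the scenario-based worst-case score here only hints at that idea.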