
Showing papers by Reuven Y. Rubinstein published in 1997


Journal Article (DOI)
TL;DR: Particular emphasis will be placed on estimation of rare events and on integration of the associated performance function into stochastic optimization programs.

710 citations


Journal Article (DOI)
TL;DR: The classical model of Ushakov on redundancy optimization of series-parallel static coherent reliability systems with uncertainty in system parameters is extended and a genetic algorithm for finding the optimal redundancy is developed.
Abstract: This paper extends the classical model of Ushakov on redundancy optimization of series-parallel static coherent reliability systems with uncertainty in system parameters. The objective function represents the total capacity of a series-parallel static system, while the decision parameters are the nominal capacity and the availability of the elements. The authors obtain explicit expressions (both analytic and via efficient simulation) for the constraint of the program, namely the CDF of the system's total capacity, and then show that the extended program is a convex mixed-integer program. Depending on whether the objective function and the associated constraints are analytically available, they suggest deterministic or stochastic (simulation-based) optimization approaches, respectively; the latter relies on likelihood ratios (a change of probability measure). A genetic algorithm for finding the optimal redundancy is developed, and supporting numerical results are presented.
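
The genetic algorithm itself is not reproduced in the abstract, so the following Python sketch is only a minimal illustration of the general idea, assuming a toy series-parallel system with made-up element capacities, availabilities, costs, budget, and GA settings (none of these values come from the paper): each chromosome is a vector of redundancy levels, fitness is a Monte Carlo estimate of expected system capacity with a penalty for exceeding the budget, and the population evolves by elitist selection, one-point crossover, and random mutation.

import random

# Hypothetical toy instance: a series system of 3 subsystems; each subsystem
# holds 1..MAX_RED identical parallel elements with a given nominal capacity,
# availability, and cost. All numbers are illustrative, not from the paper.
SUBSYSTEMS = [
    # (nominal capacity per element, element availability, cost per element)
    (10.0, 0.90, 3.0),
    (15.0, 0.85, 5.0),
    (12.0, 0.95, 4.0),
]
MAX_RED = 5          # maximum redundancy level per subsystem
BUDGET = 40.0        # total cost constraint
N_SAMPLES = 200      # Monte Carlo samples per fitness evaluation

def expected_capacity(redundancy, rng):
    """Estimate E[system capacity]: the capacity of a series system is the
    minimum over subsystems of the total capacity of its working elements."""
    total = 0.0
    for _ in range(N_SAMPLES):
        sub_caps = []
        for (cap, avail, _), r in zip(SUBSYSTEMS, redundancy):
            working = sum(rng.random() < avail for _ in range(r))
            sub_caps.append(cap * working)
        total += min(sub_caps)
    return total / N_SAMPLES

def cost(redundancy):
    return sum(c * r for (_, _, c), r in zip(SUBSYSTEMS, redundancy))

def fitness(redundancy, rng):
    # Penalize budget violations instead of rejecting them outright.
    penalty = max(0.0, cost(redundancy) - BUDGET) * 10.0
    return expected_capacity(redundancy, rng) - penalty

def genetic_search(pop_size=30, generations=50, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(1, MAX_RED) for _ in SUBSYSTEMS] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda x: fitness(x, rng), reverse=True)
        elite = scored[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, len(SUBSYSTEMS))
            child = a[:cut] + b[cut:]                  # one-point crossover
            if rng.random() < 0.2:                     # mutation
                i = rng.randrange(len(SUBSYSTEMS))
                child[i] = rng.randint(1, MAX_RED)
            children.append(child)
        pop = elite + children
    best = max(pop, key=lambda x: fitness(x, rng))
    return best, expected_capacity(best, rng), cost(best)

if __name__ == "__main__":
    best, cap, c = genetic_search()
    print("redundancy levels:", best, "E[capacity] ~", round(cap, 2), "cost:", c)

Running the script prints the best redundancy vector found together with its estimated expected capacity and cost; since fitness is estimated by simulation, the result is itself noisy.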

41 citations


Journal Article (DOI)
TL;DR: A method for fast estimation of probabilities of rare events in stochastic networks, with a particular emphasis on coherent reliability systems, based on the concepts of likelihood-ratios, change of probability measure and the bottleneck-cut in the network.
Abstract: This paper presents a method for fast estimation of probabilities of rare events in stochastic networks, with a particular emphasis on coherent reliability systems. The method is based on the concepts of likelihood ratios (LR), change of probability measure, and the bottleneck cut in the network. Both polynomial- and exponential-time Monte Carlo estimators are defined, and conditions under which the time complexity of the proposed LR estimators is bounded by a polynomial are discussed. The accuracy of the method depends only on the size (cardinality) of the bottleneck cut, not on the topology and actual size of the network. Supporting numerical results are presented, with bottleneck-cut cardinality ≤ 20.
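
As a hedged illustration of the likelihood-ratio idea (not the paper's estimator), the sketch below estimates a rare failure probability for a hypothetical three-component bottleneck cut: the system is assumed to fail only when every cut component fails, samples are drawn under a tilted measure Q with inflated failure probabilities, and each sample is reweighted by the likelihood ratio dP/dQ. All probabilities and the cut structure are invented for the example.

import random

# True (P) and tilted (Q) failure probabilities of the cut components.
# Illustrative values only.
P_FAIL = [1e-3, 2e-3, 1e-3]
Q_FAIL = [0.5, 0.5, 0.5]

def system_down(states):
    """Structure function of the cut: the system is down iff every cut component fails."""
    return all(states)

def crude_mc(n, rng):
    hits = sum(
        system_down([rng.random() < p for p in P_FAIL]) for _ in range(n)
    )
    return hits / n

def lr_estimator(n, rng):
    """Sample component states under Q and reweight hits by the likelihood ratio dP/dQ."""
    total = 0.0
    for _ in range(n):
        states = [rng.random() < q for q in Q_FAIL]
        if system_down(states):
            lr = 1.0
            for p, q, failed in zip(P_FAIL, Q_FAIL, states):
                lr *= (p / q) if failed else ((1 - p) / (1 - q))
            total += lr
    return total / n

if __name__ == "__main__":
    rng = random.Random(1)
    exact = 1.0
    for p in P_FAIL:
        exact *= p                          # product of the P_FAIL entries
    print("exact            :", exact)
    print("crude MC (1e5)   :", crude_mc(10**5, rng))
    print("LR estimate (1e5):", lr_estimator(10**5, rng))

With the values above the exact probability is 2e-9; crude Monte Carlo with 10^5 samples almost surely observes no failures at all, while the LR estimator recovers the probability accurately from the same sample size because every Q-sample that hits the rare event contributes a correctly weighted term.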

38 citations


Journal Article (DOI)
TL;DR: This work examines how to combine the score function method with the standard crude Monte Carlo and experimental design approaches, in order to evaluate the expected performance of a discrete event system and its associated gradient simultaneously for different scenarios.
Abstract: In this work, we examine how to combine the score function method with standard crude Monte Carlo and experimental design approaches in order to evaluate the expected performance of a discrete event system and its associated gradient simultaneously for different scenarios (combinations of parameter values). We also optimize the expected performance with respect to two parameter sets: parameters of the underlying probability law (for the system's evolution) and parameters of the sample performance measure. We explore how the stochastic approximation and stochastic counterpart methods can be combined to perform optimization with respect to both sets of parameters at the same time. We outline three combined algorithms of that form, one sequential and two parallel, and give a convergence proof for one of them. We discuss a number of issues related to the implementation and convergence of those algorithms, introduce averaging variants, and give numerical illustrations.
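
The sketch below is a simplified, hypothetical instance of the two ingredients named above, not the paper's algorithms: theta parameterizes the underlying probability law (here X ~ Exponential(rate theta)), v parameterizes the sample performance measure H(x, v) = (x - v)^2, and a plain stochastic-approximation loop updates both at once, using a score-function (likelihood-ratio) estimate for the gradient in theta and a pathwise derivative for the gradient in v. The objective, the exponential model, and all step sizes are assumptions made for illustration.

import math
import random

# Minimal sketch, not the paper's model: minimize
#     l(theta, v) = E_theta[(X - v)**2] + 2*theta,  X ~ Exponential(rate=theta),
# whose analytical minimizer is theta = v = 1. The gradient in theta is
# estimated via the score function d/dtheta ln f_theta(X) = 1/theta - X;
# the gradient in v uses the pathwise derivative dH/dv = -2*(X - v).

def score(theta, x):
    return 1.0 / theta - x          # score of the Exp(theta) density

def sa_optimize(iters=20000, batch=10, seed=0):
    rng = random.Random(seed)
    theta, v = 2.0, 0.0             # arbitrary starting point
    for k in range(1, iters + 1):
        step = 0.05 / math.sqrt(k)  # diminishing Robbins-Monro step size
        g_theta, g_v = 0.0, 0.0
        for _ in range(batch):
            x = rng.expovariate(theta)
            h = (x - v) ** 2
            g_theta += h * score(theta, x)   # score-function gradient estimate
            g_v += -2.0 * (x - v)            # pathwise derivative in v
        g_theta = g_theta / batch + 2.0      # add d/dtheta of the 2*theta term
        g_v /= batch
        theta -= step * g_theta
        v -= step * g_v
        theta = min(max(theta, 0.1), 10.0)   # keep the rate in a safe box
    return theta, v

if __name__ == "__main__":
    theta, v = sa_optimize()
    print("estimated minimizer: theta =", round(theta, 3), " v =", round(v, 3))

For this toy objective the iterates should approach the analytical minimizer theta = v = 1; the point of the sketch is only to show one stochastic-approximation loop driving both the probability-law parameter and the performance-measure parameter simultaneously, as the abstract describes.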

4 citations


01 Jan 1997
TL;DR: In this paper, the authors propose a method based on the concepts of likelihood ratios (LR), change of probability measure (CPM), and the bottleneck cut in the network.
Abstract: This paper presents a method for fast estimation of probabilities of rare events in stochastic networks, with a particular emphasis on coherent reliability systems. The method is based on the concepts of likelihood ratios (LR), change of probability measure, and the bottleneck cut in the network. Both polynomial- and exponential-time Monte Carlo estimators are defined, and conditions under which the time complexity of the proposed LR estimators is bounded by a polynomial are discussed. The accuracy of the method depends only on the size (cardinality) of the bottleneck cut, not on the topology and actual size of the network. Supporting numerical results are presented, with bottleneck-cut cardinality ≤ 20.

1 citation