Author

L. Jeff Hong

Bio: L. Jeff Hong is an academic researcher from Fudan University. The author has contributed to research in the topics of Estimator and Selection (genetic algorithm). The author has an h-index of 24 and has co-authored 91 publications receiving 2166 citations. Previous affiliations of L. Jeff Hong include the Hong Kong University of Science and Technology and the City University of Hong Kong.


Papers
Journal ArticleDOI
TL;DR: In this article, an optimization-via-simulation algorithm called COMPASS is proposed for problems in which the performance measure is estimated via a stochastic, discrete-event simulation and the decision variables are integer ordered.
Abstract: We propose an optimization-via-simulation algorithm, called COMPASS, for use when the performance measure is estimated via a stochastic, discrete-event simulation, and the decision variables are integer ordered. We prove that COMPASS converges to the set of local optimal solutions with probability 1 for both terminating and steady-state simulation, and for both fully constrained problems and partially constrained or unconstrained problems under mild conditions.

261 citations
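
Below is a minimal, illustrative sketch (in Python) of the "most promising area" idea behind a COMPASS-style search on a one-dimensional integer domain: candidates are sampled from the set of integers at least as close to the current sample-best solution as to any other visited solution. The toy objective `noisy_objective`, the fixed replication counts, and the stopping rule are assumptions made for illustration; this is not the paper's algorithm or its convergence conditions.

```python
import random

def noisy_objective(x):
    # Hypothetical stochastic simulation: true objective (x - 7)^2 plus noise.
    return (x - 7) ** 2 + random.gauss(0, 2)

def compass_like_search(lo, hi, iterations=200, reps_per_visit=5, seed=0):
    """Toy COMPASS-style search over the integers lo..hi (a sketch, not the paper's method)."""
    random.seed(seed)
    visited = {}  # solution -> list of simulation observations

    def observe(x, n):
        visited.setdefault(x, []).extend(noisy_objective(x) for _ in range(n))

    def sample_mean(x):
        obs = visited[x]
        return sum(obs) / len(obs)

    x_best = random.randint(lo, hi)
    observe(x_best, reps_per_visit)

    for _ in range(iterations):
        others = [y for y in visited if y != x_best]
        # "Most promising area": integers no farther from x_best than from any other visited point.
        mpa = [x for x in range(lo, hi + 1)
               if all(abs(x - x_best) <= abs(x - y) for y in others)]
        candidate = random.choice(mpa)
        observe(candidate, reps_per_visit)
        observe(x_best, reps_per_visit)  # keep refining the incumbent's estimate
        if sample_mean(candidate) < sample_mean(x_best):
            x_best = candidate
    return x_best, sample_mean(x_best)

if __name__ == "__main__":
    print(compass_like_search(0, 20))  # should land near x = 7 for this toy objective
```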

Journal ArticleDOI
TL;DR: It is shown that the solutions of the sequence of approximations converge to a Karush-Kuhn-Tucker (KKT) point of the JCCP under a certain asymptotic regime.
Abstract: When there is parameter uncertainty in the constraints of a convex optimization problem, it is natural to formulate the problem as a joint chance constrained program (JCCP), which requires that all constraints be satisfied simultaneously with a given large probability. In this paper, we propose to solve the JCCP by a sequence of convex approximations. We show that the solutions of the sequence of approximations converge to a Karush-Kuhn-Tucker (KKT) point of the JCCP under a certain asymptotic regime. Furthermore, we propose to use a gradient-based Monte Carlo method to solve the sequence of convex approximations.

186 citations
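
As a rough, hedged illustration of the Monte Carlo side of this approach, the sketch below smooths the indicator of joint constraint violation with a sigmoid and estimates the smoothed constraint value and its gradient from samples. The toy constraints, the sigmoid surrogate, and the smoothing parameter `eps` are assumptions for illustration only; the paper's actual sequence of convex approximations and its KKT convergence analysis are different.

```python
import math
import random

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def smooth_chance_constraint(x, xi_samples, eps=0.05):
    """Monte Carlo estimate of a smoothed joint chance constraint and its gradient.

    Toy joint constraint (an assumption, not the paper's model):
        g1(x, xi) = xi[0] - x[0],   g2(x, xi) = xi[1] - x[1],
    so the constraint is violated when either component of xi exceeds x.
    The indicator 1{max_i g_i > 0} is replaced by sigmoid(max_i g_i / eps).
    """
    value = 0.0
    grad = [0.0, 0.0]
    for xi in xi_samples:
        g = [xi[0] - x[0], xi[1] - x[1]]
        k = 0 if g[0] >= g[1] else 1          # active constraint for this sample
        s = sigmoid(g[k] / eps)
        value += s
        grad[k] += -s * (1.0 - s) / eps       # d/dx_k of sigmoid((xi_k - x_k)/eps)
    n = len(xi_samples)
    return value / n, [gk / n for gk in grad]

if __name__ == "__main__":
    random.seed(3)
    xis = [(random.gauss(1.0, 0.3), random.gauss(1.0, 0.3)) for _ in range(20000)]
    prob_violation, gradient = smooth_chance_constraint([1.2, 1.2], xis)
    print(prob_violation, gradient)  # compare the estimate against a target level such as alpha = 0.05
```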

Journal ArticleDOI
TL;DR: This paper shows that quantile sensitivities can be written in the form of conditional expectations, proposes an infinitesimal-perturbation-analysis (IPA) estimator based on this form, and obtains a consistent estimator by dividing the data into batches and averaging the IPA estimates of all batches.
Abstract: Quantiles of a random performance serve as important alternatives to the usual expected value. They are used in the financial industry as measures of risk and in the service industry as measures of service quality. To manage the quantile of a performance, we need to know how changes in the input parameters affect the output quantiles, which are called quantile sensitivities. In this paper, we show that the quantile sensitivities can be written in the form of conditional expectations. Based on the conditional-expectation form, we first propose an infinitesimal-perturbation-analysis (IPA) estimator. The IPA estimator is asymptotically unbiased, but it is not consistent. We then obtain a consistent estimator by dividing data into batches and averaging the IPA estimates of all batches. The estimator satisfies a central limit theorem for i.i.d. data, and the rate of convergence is strictly slower than n^{-1/3}. The numerical results show that the estimator works well for practical problems.

139 citations
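
The batching idea lends itself to a short sketch. The toy model below assumes L(theta) = theta * Z with Z exponentially distributed, chosen only because the true quantile sensitivity, -ln(1 - alpha), is then known in closed form; the batch and sample sizes are arbitrary illustrative choices, not the paper's recommendations.

```python
import math
import random

def batched_quantile_sensitivity(theta, alpha, n_batches=50, batch_size=200, seed=1):
    """Batched IPA-style estimate of d q_alpha(theta) / d theta for L(theta) = theta * Z, Z ~ Exp(1)."""
    random.seed(seed)
    estimates = []
    for _ in range(n_batches):
        # One batch of i.i.d. replications: keep (loss, pathwise derivative dL/dtheta) pairs.
        pairs = []
        for _ in range(batch_size):
            z = random.expovariate(1.0)
            pairs.append((theta * z, z))
        pairs.sort(key=lambda p: p[0])
        idx = math.ceil(alpha * batch_size) - 1
        # Batch IPA estimate: the pathwise derivative attached to the empirical alpha-quantile.
        estimates.append(pairs[idx][1])
    return sum(estimates) / n_batches

if __name__ == "__main__":
    est = batched_quantile_sensitivity(theta=2.0, alpha=0.9)
    print(est, -math.log(1 - 0.9))  # estimate vs. the known true sensitivity 2.302...
```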

Journal ArticleDOI
TL;DR: This paper proves that the CVaR sensitivity can be written as a conditional expectation for general loss distributions, proposes an estimator of the CVaR sensitivity, demonstrates how to use it to solve optimization problems with CVaR objective and/or constraints, and compares it to a popular linear-programming-based algorithm.
Abstract: Conditional value at risk (CVaR) is both a coherent risk measure and a natural risk statistic. It is often used to measure the risk associated with large losses. In this paper, we study how to estimate the sensitivities of CVaR using Monte Carlo simulation. We first prove that the CVaR sensitivity can be written as a conditional expectation for general loss distributions. We then propose an estimator of the CVaR sensitivity and analyze its asymptotic properties. The numerical results show that the estimator works well. Furthermore, we demonstrate how to use the estimator to solve optimization problems with CVaR objective and/or constraints, and compare it to a popular linear programming-based algorithm.

127 citations
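
Since the CVaR sensitivity takes the form of a conditional expectation over the tail, a simple Monte Carlo sketch is possible. The toy model below assumes L(theta) = theta * Z with Z standard normal and theta > 0, so the true sensitivity E[Z | Z >= z_alpha] is known in closed form; the sample size, seed, and tail cutoff rule are illustrative assumptions, not the paper's estimator as stated.

```python
import math
import random

def cvar_sensitivity(theta, alpha, n=100000, seed=2):
    """Estimate d CVaR_alpha(L(theta)) / d theta by averaging pathwise derivatives over the tail."""
    random.seed(seed)
    samples = []
    for _ in range(n):
        z = random.gauss(0.0, 1.0)
        samples.append((theta * z, z))     # (loss, dL/dtheta)
    samples.sort(key=lambda p: p[0])
    cutoff = math.ceil(alpha * n) - 1      # index of the empirical VaR
    tail = samples[cutoff:]
    return sum(d for _, d in tail) / len(tail)

if __name__ == "__main__":
    alpha = 0.95
    z_a = 1.6448536269514722               # 95% quantile of the standard normal
    true_value = math.exp(-z_a ** 2 / 2) / math.sqrt(2 * math.pi) / (1 - alpha)
    print(cvar_sensitivity(theta=2.0, alpha=alpha), true_value)  # both should be near 2.06
```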

Proceedings ArticleDOI
13 Dec 2009
TL;DR: Three types of OvS problems are introduced: the R&S problems, the continuous OvS problems, and the discrete OvS problems; issues and current research developments for these problems are discussed.
Abstract: Optimization via simulation (OvS) is an exciting and fast developing area for both research and practice. In this article, we introduce three types of OvS problems: the R&S problems, the continuous OvS problems and the discrete OvS problems, and discuss the issues and current research development for these problems. We also give some suggestions on how to use commercial OvS software in practice.

123 citations


Cited by
Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
TL;DR: Convergence of Probability Measures, by P. Billingsley, is a standard reference on the theory of weak convergence of probability measures.
Abstract: Convergence of Probability Measures. By P. Billingsley. Chichester, Sussex, Wiley, 1968. xii, 253 p. 9 1/4". 117s.

5,689 citations

Book ChapterDOI
01 Jan 2011
TL;DR: Weak-convergence methods in metric spaces are studied in this work, with applications sufficient to show their power and utility; the results of the first three chapters are used in Chapter 4 to derive a variety of limit theorems for dependent sequences of random variables.
Abstract: The author's preface gives an outline: "This book is about weak-convergence methods in metric spaces, with applications sufficient to show their power and utility. The Introduction motivates the definitions and indicates how the theory will yield solutions to problems arising outside it. Chapter 1 sets out the basic general theorems, which are then specialized in Chapter 2 to the space C[0, 1] of continuous functions on the unit interval and in Chapter 3 to the space D[0, 1] of functions with discontinuities of the first kind. The results of the first three chapters are used in Chapter 4 to derive a variety of limit theorems for dependent sequences of random variables." The book develops and expands on Donsker's 1951 and 1952 papers on the invariance principle and empirical distributions. The basic random variables remain real-valued although, of course, measures on C[0, 1] and D[0, 1] are vitally used. Within this framework, there are various possibilities for a different and apparently better treatment of the material. More of the general theory of weak convergence of probabilities on separable metric spaces would be useful. Metrizability of the convergence is not brought up until late in the Appendix. The close relation of the Prokhorov metric and a metric for convergence in probability is (hence) not mentioned (see V. Strassen, Ann. Math. Statist. 36 (1965), 423-439; the reviewer, ibid. 39 (1968), 1563-1572). This relation would illuminate and organize such results as Theorems 4.1, 4.2 and 4.4, which give isolated, ad hoc connections between weak convergence of measures and nearness in probability. In the middle of p. 16, it should be noted that C*(S) consists of signed measures which need only be finitely additive if S is not compact. On p. 239, where the author twice speaks of separable subsets having nonmeasurable cardinal, he means "discrete" rather than "separable." Theorem 1.4 is Ulam's theorem that a Borel probability on a complete separable metric space is tight. Theorem 1 of Appendix 3 weakens completeness to topological completeness. After mentioning that probabilities on the rationals are tight, the author says it is an

3,554 citations
