Author

David A. Goldberg

Bio: David A. Goldberg is an academic researcher from Cornell University. The author has contributed to research in the areas of queueing and lead times. The author has an h-index of 16, has co-authored 46 publications, and has received 604 citations. Previous affiliations of David A. Goldberg include Los Alamos National Laboratory and Georgia Institute of Technology.

Papers
Journal ArticleDOI
TL;DR: This paper shows that when lead times are large, a very simple constant-order policy, first studied by Reiman, performs nearly optimally; the proof combines a novel coupling for suprema of random walks with arguments from queueing theory.
Abstract: Lost sales inventory models with large lead times, which arise in many practical settings, are notoriously difficult to optimize due to the curse of dimensionality. In this paper, we show that when lead times are large, a very simple constant-order policy, first studied by Reiman, performs nearly optimally. The main insight of our work is that when the lead time is very large, such a significant amount of randomness is injected into the system between when an order for more inventory is placed and when the order is received, that “being smart” algorithmically provides almost no benefit. Our main proof technique combines a novel coupling for suprema of random walks with arguments from queueing theory.
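To make the policy concrete, here is a minimal simulation sketch of a periodic-review lost-sales system operating under a constant-order policy. All function and parameter names, cost values, and the demand distribution are illustrative assumptions, not taken from the paper:

```python
import random

def constant_order_avg_cost(r, lead_time, horizon, demand_sampler,
                            holding_cost=1.0, lost_sales_penalty=4.0, seed=0):
    """Estimate the long-run average cost of the constant-order policy
    (order a fixed quantity r every period) in a periodic-review
    lost-sales inventory system with a deterministic lead time."""
    rng = random.Random(seed)
    on_hand = 0.0
    pipeline = [r] * lead_time        # orders already in transit
    total_cost = 0.0
    for _ in range(horizon):
        on_hand += pipeline.pop(0)    # order placed lead_time periods ago arrives
        demand = demand_sampler(rng)
        lost = max(demand - on_hand, 0.0)   # excess demand is lost, not backlogged
        on_hand = max(on_hand - demand, 0.0)
        total_cost += holding_cost * on_hand + lost_sales_penalty * lost
        pipeline.append(r)            # the policy ignores all state information
    return total_cost / horizon

# Example: exponential demand with mean 1, constant order slightly below mean demand.
print(constant_order_avg_cost(r=0.9, lead_time=100, horizon=200_000,
                              demand_sampler=lambda rng: rng.expovariate(1.0)))
```

Note that the ordering decision never inspects the pipeline or the on-hand inventory, which is exactly why the policy sidesteps the curse of dimensionality.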

71 citations

Journal ArticleDOI
TL;DR: The main proof technique combines novel convexity and lower-bounding arguments, an explicit implementation of the vanishing discount factor approach to analyzing infinite-horizon Markov decision processes, and ideas from the theory of random walks and queues, significantly extending the methodology and applicability of a novel framework for analyzing inventory models with large lead times.
Abstract: Dual-sourcing inventory systems, in which one supplier is faster (i.e., express) and more costly, while the other is slower (i.e., regular) and cheaper, arise naturally in many real-world supply chains. These systems are notoriously difficult to optimize because of the complex structure of the optimal solution and the curse of dimensionality, having resisted solution for over 40 years. Recently, so-called tailored base-surge (TBS) policies have been proposed as a heuristic for the dual-sourcing problem. Under such a policy, a constant order is placed at the regular source in each period, while the order placed at the express source follows a simple order-up-to rule. Numerical experiments by several authors have suggested that such policies perform well as the lead time difference between the two sources grows large, which is exactly the setting in which the curse of dimensionality leads to the problem becoming intractable. However, providing a theoretical foundation for this phenomenon has remained a major open problem. In this paper, we provide such a theoretical foundation by proving that a simple TBS policy is indeed asymptotically optimal as the lead time of the regular source grows large, with the lead time of the express source held fixed. Our main proof technique combines novel convexity and lower-bounding arguments, an explicit implementation of the vanishing discount factor approach to analyzing infinite-horizon Markov decision processes, and ideas from the theory of random walks and queues, significantly extending the methodology and applicability of a novel framework for analyzing inventory models with large lead times recently introduced by Goldberg and coauthors in the context of lost-sales models with positive lead times. This paper was accepted by Gad Allon, operations management.
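The structure of a TBS policy is simple enough to state in a few lines of code. Below is a hedged simulation sketch with backlogged demand and, for simplicity, a zero express lead time; all names, cost values, and that simplification are illustrative assumptions, not the paper's model verbatim:

```python
import random

def tbs_avg_cost(r, S, regular_lead, horizon, demand_sampler,
                 h=1.0, b=9.0, express_premium=2.0, seed=0):
    """Estimate the long-run average cost of a tailored base-surge (TBS)
    policy: a constant order r at the slow/regular source each period,
    plus an order-up-to-S rule at the fast/express source."""
    rng = random.Random(seed)
    net_inventory = 0.0                   # on-hand minus backorders
    pipeline = [r] * regular_lead         # outstanding regular orders
    total_cost = 0.0
    for _ in range(horizon):
        net_inventory += pipeline.pop(0)  # regular order from regular_lead periods ago arrives
        express = max(S - net_inventory, 0.0)  # order-up-to rule at the express source
        net_inventory += express               # zero express lead time: arrives immediately
        demand = demand_sampler(rng)
        net_inventory -= demand                # unmet demand is backlogged
        total_cost += (express_premium * express
                       + h * max(net_inventory, 0.0)
                       + b * max(-net_inventory, 0.0))
        pipeline.append(r)                # constant order at the regular source
    return total_cost / horizon

# Example: exponential demand with mean 1; the regular source covers half of mean demand.
print(tbs_avg_cost(r=0.5, S=2.0, regular_lead=40, horizon=200_000,
                   demand_sampler=lambda rng: rng.expovariate(1.0)))
```

The express order is the only state-dependent decision, and it depends on a single scalar, which is what keeps the policy tractable even when the regular lead time is long.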

59 citations

Journal ArticleDOI
TL;DR: This work proves that for the infinite-horizon variant of the same lost sales problem, the optimality gap of the constant-order policy actually converges exponentially fast to zero, decaying at least as fast as the exponential rate at which the expected waiting time in a related single-server queue converges to its steady-state value.
Abstract: Inventory models with lost sales and large lead times have traditionally been considered intractable due to the curse of dimensionality. Recently, Goldberg and coauthors laid the foundations for a new approach to solving these models, by proving that as the lead time grows large, a simple constant-order policy is asymptotically optimal. However, the bounds proven there require the lead time to be very large before the constant-order policy becomes effective, in contrast to the good numerical performance demonstrated by Zipkin even for small lead time values. In this work, we prove that for the infinite-horizon variant of the same lost sales problem, the optimality gap of the same constant-order policy actually converges exponentially fast to zero, with the optimality gap decaying to zero at least as fast as the exponential rate of convergence of the expected waiting time in a related single-server queue to its steady-state value. We also derive simple and explicit bounds for the optimality gap, and demonstrate …
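Schematically, and in illustrative notation not drawn verbatim from the paper, the result says that if $C(L)$ denotes the long-run average cost of the constant-order policy when the lead time is $L$ and $\mathrm{OPT}(L)$ denotes the optimal cost, then there exist constants $C_0 > 0$ and $\theta > 0$ such that
\[ \frac{C(L)}{\mathrm{OPT}(L)} \;\leq\; 1 + C_0\, e^{-\theta L}, \]
where, per the abstract, $\theta$ may be taken at least as large as the exponential rate at which the expected waiting time of the related single-server queue converges to its steady-state value.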

46 citations

Journal ArticleDOI
TL;DR: In this paper, a simple greedy algorithm for finding large independent sets and matchings in constant-degree regular graphs is analyzed, with the main results concerning r-regular graphs with n nodes and girth at least g; the results imply improved bounds for the size of the largest independent set in these graphs.
Abstract: We derive new results for the performance of a simple greedy algorithm for finding large independent sets and matchings in constant-degree regular graphs. We show that for r-regular graphs with n nodes and girth at least g, the algorithm finds an independent set of expected cardinality \[ f(r)\,n - O\biggl(\frac{(r-1)^{g/2}}{(g/2)!}\, n\biggr), \] where f(r) is a function which we explicitly compute. A similar result is established for matchings. Our results imply improved bounds for the size of the largest independent set in these graphs, and provide the first results of this type for matchings. As an implication we show that the greedy algorithm returns a nearly perfect matching when both the degree r and girth g are large. Furthermore, we show that the cardinality of independent sets and matchings produced by the greedy algorithm in arbitrary bounded-degree graphs is concentrated around the mean. Finally, we analyse the performance of the greedy algorithm for the case of random i.i.d. weighted independent sets and matchings, and obtain a remarkably simple expression for the limiting expected values produced by the algorithm. In fact, all the other results are obtained as straightforward corollaries from the results for the weighted case.
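For concreteness, here is a minimal sketch of the randomized greedy algorithm for independent sets described in the abstract: repeatedly pick a uniformly random surviving vertex, add it to the independent set, and delete it together with its neighbors. The graph encoding and function name are our own illustrative choices:

```python
import random

def greedy_independent_set(adj, seed=0):
    """Randomized greedy independent set.
    `adj` maps each vertex to the set of its neighbors."""
    rng = random.Random(seed)
    alive = set(adj)                   # vertices not yet deleted
    independent = []
    while alive:
        v = rng.choice(sorted(alive))  # uniform over survivors; sorted() only for reproducibility
        independent.append(v)
        alive -= {v} | (adj[v] & alive)  # delete v and its surviving neighbors
    return independent

# Example: the 3-regular cube graph Q3 (vertices are 3-bit strings, edges flip one bit).
cube = {i: {i ^ 1, i ^ 2, i ^ 4} for i in range(8)}
print(greedy_independent_set(cube))
```

The matching variant is analogous: pick a uniformly random surviving edge, add it to the matching, and delete both endpoints.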

37 citations

Journal ArticleDOI
TL;DR: In this paper, the authors consider the FCFS $GI/G/n$ queue in the so-called Halfin–Whitt heavy traffic regime, proving tightness of the normalized steady-state queue lengths and deriving an upper bound on the large deviation exponent of the limiting steady-state queue length that matches the exponent conjectured by Gamarnik and Momcilovic, together with a matching lower bound when the arrival process is Poisson.
Abstract: We consider the FCFS $GI/G/n$ queue in the so-called Halfin–Whitt heavy traffic regime. We prove that under minor technical conditions the associated sequence of steady-state queue length distributions, normalized by $n^{1/2}$, is tight. We derive an upper bound on the large deviation exponent of the limiting steady-state queue length matching that conjectured by Gamarnik and Momcilovic [Adv. in Appl. Probab. 40 (2008) 548–577]. We also prove a matching lower bound when the arrival process is Poisson. Our main proof technique is the derivation of new and simple bounds for the FCFS $GI/G/n$ queue. Our bounds are of a structural nature, hold for all $n$ and all times $t\geq0$, and have intuitive closed-form representations as the suprema of certain natural processes which converge weakly to Gaussian processes. We further illustrate the utility of this methodology by deriving the first nontrivial bounds for the weak limit process studied in [Ann. Appl. Probab. 19 (2009) 2211–2269].
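For readers unfamiliar with the regime, a standard formulation of the Halfin–Whitt scaling (stated here in generic notation, not quoted from the paper) lets the arrival rate $\lambda_n$ and the number of servers $n$ grow together so that
\[ \sqrt{n}\,\bigl(1-\rho_n\bigr) \;\to\; \beta \in (0,\infty), \qquad \rho_n = \frac{\lambda_n}{n\mu}, \]
where $\mu$ is the service rate of each server; the tightness result above concerns the steady-state queue lengths normalized by $n^{1/2}$ under this scaling.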

37 citations


Cited by
Proceedings ArticleDOI
22 Jan 2006
TL;DR: Some of the major results in random graphs and some of the more challenging open problems are reviewed, covering algorithmic and structural questions and touching on newer models, including those related to the WWW.
Abstract: We will review some of the major results in random graphs and some of the more challenging open problems. We will cover algorithmic and structural questions. We will touch on newer models, including those related to the WWW.

7,116 citations

Book ChapterDOI
01 Jan 2011
TL;DR: Weak-convergence methods in metric spaces are developed in this book, with applications sufficient to show their power and utility; the results of the first three chapters are used in Chapter 4 to derive a variety of limit theorems for dependent sequences of random variables.
Abstract: The author's preface gives an outline: "This book is about weak-convergence methods in metric spaces, with applications sufficient to show their power and utility. The Introduction motivates the definitions and indicates how the theory will yield solutions to problems arising outside it. Chapter 1 sets out the basic general theorems, which are then specialized in Chapter 2 to the space C[0, 1] of continuous functions on the unit interval and in Chapter 3 to the space D[0, 1] of functions with discontinuities of the first kind. The results of the first three chapters are used in Chapter 4 to derive a variety of limit theorems for dependent sequences of random variables." The book develops and expands on Donsker's 1951 and 1952 papers on the invariance principle and empirical distributions. The basic random variables remain real-valued although, of course, measures on C[0, 1] and D[0, 1] are vitally used. Within this framework, there are various possibilities for a different and apparently better treatment of the material. More of the general theory of weak convergence of probabilities on separable metric spaces would be useful. Metrizability of the convergence is not brought up until late in the Appendix. The close relation of the Prokhorov metric and a metric for convergence in probability is (hence) not mentioned (see V. Strassen, Ann. Math. Statist. 36 (1965), 423-439; the reviewer, ibid. 39 (1968), 1563-1572). This relation would illuminate and organize such results as Theorems 4.1, 4.2 and 4.4, which give isolated, ad hoc connections between weak convergence of measures and nearness in probability. In the middle of p. 16, it should be noted that C*(S) consists of signed measures which need only be finitely additive if S is not compact. On p. 239, where the author twice speaks of separable subsets having nonmeasurable cardinal, he means "discrete" rather than "separable." Theorem 1.4 is Ulam's theorem that a Borel probability on a complete separable metric space is tight. Theorem 1 of Appendix 3 weakens completeness to topological completeness. After mentioning that probabilities on the rationals are tight, the author says it is an …

3,554 citations

Journal ArticleDOI
TL;DR: In this paper, applied probability and queueing in the field of applied probabilistic analysis are discussed, with a focus on the application of queueing in the context of road traffic.
Abstract: (1987). Applied Probability and Queues. Journal of the Operational Research Society: Vol. 38, No. 11, pp. 1095-1096.

1,121 citations