Author

Lawrence M. Wein

Bio: Lawrence M. Wein is an academic researcher at Stanford University. He has contributed to research in topics including queueing theory and dynamic priority scheduling, has an h-index of 50, and has co-authored 136 publications receiving 9,449 citations. Previous affiliations of Lawrence M. Wein include George Mason University and the World Health Organization.


Papers
Journal ArticleDOI
TL;DR: It is argued that agency-related costs and information transfer costs will tend to drive the locus of problem-solving away from specialist suppliers and toward those who directly benefit from a solution and who hold difficult-to-transfer local information about the particular application being solved, such as the direct users of a product or service.
Abstract: Those who solve more of a given type of problem tend to get better at it-which suggests that problems of any given type should be brought to specialists for a solution. However, in this paper we argue that agency-related costs and information transfer costs ("sticky" local information) will tend to drive the locus of problem-solving in the opposite direction-away from problem-solving by specialist suppliers, and towards those who directly benefit from a solution and who have difficult-to-transfer local information about a particular application being solved, such as the direct users of a product or service. We examine the actual location of design activities in two fields in which custom products are produced by "mass-customization" methods: application-specific integrated circuits (ASICs) and computer telephony integration (CTI) systems. In both, we find that users rather than suppliers are the actual designers of the application-specific portion of the product types examined. We offer anecdotal evidence that the pattern of user-based customization we have documented in these two fields is in fact quite general, and we discuss implications for research and practice.

898 citations

Journal ArticleDOI
TL;DR: A variety of input control and sequencing rules are evaluated using a simulation model of a representative, but fictitious, semiconductor wafer fabrication; the results indicate that scheduling has a significant impact on average throughput time, with larger improvements coming from discretionary input control than from lot sequencing.
Abstract: The impact that scheduling can have on the performance of semiconductor wafer fabrication facilities is assessed. The performance measure considered is the mean throughput time (sometimes called cycle time, turnaround time or manufacturing interval) for a lot of wafers. A variety of input control and sequencing rules are evaluated using a simulation model of a representative, but fictitious, semiconductor wafer fabrication. Certain of these scheduling rules are derived by restricting attention to the subset of stations that are heavily utilized, and by using a Brownian network model, which approximates a multi-class queueing network model with dynamic control capability. Three versions of the wafer fabrication model, which differ only by the number of servers present at particular stations, are studied. The three versions have one, two, and four stations, respectively, that are heavily utilized (near 90% utilization). The simulation results indicate that scheduling has a significant impact on average throughput time, with larger improvements coming from discretionary input control than from lot sequencing. The effects that specific sequencing rules have are highly dependent on both the type of input control used and the number of bottleneck stations in the fabrication.
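
The input-control finding can be illustrated with a toy single-bottleneck simulation. This is a deliberate simplification of the paper's multi-station fab model: the two release rules, the exponential service times, and all parameters below are illustrative assumptions, not the paper's.

```python
import random

def sim_push(n_lots=20_000, interval=1.05, seed=7):
    """Push release: lots enter on a fixed schedule, regardless of
    congestion, and queue at a single bottleneck with exponential
    service (mean 1.0). Returns the average cycle time per lot."""
    rng = random.Random(seed)
    t_free, total = 0.0, 0.0          # when the machine is next free
    for k in range(n_lots):
        arrive = k * interval
        start = max(arrive, t_free)    # wait if the machine is busy
        t_free = start + rng.expovariate(1.0)
        total += t_free - arrive       # this lot's cycle time
    return total / n_lots

def sim_wip_cap(n_lots=20_000, wip=3, seed=7):
    """Discretionary input control: a new lot is released only when
    one finishes, holding work-in-process at `wip` lots."""
    rng = random.Random(seed)
    t_free, total = 0.0, 0.0
    done = []                          # completion times
    for k in range(n_lots):
        arrive = 0.0 if k < wip else done[k - wip]
        start = max(arrive, t_free)
        t_free = start + rng.expovariate(1.0)
        done.append(t_free)
        total += t_free - arrive
    return total / n_lots
```

Capping work-in-process keeps the average cycle time near `wip` service times (by Little's law), while the push rule at roughly 95% utilization lets queues, and hence throughput times, grow much larger.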

643 citations

Journal ArticleDOI
TL;DR: The authors estimate the number of cases and deaths that would result from a smallpox attack in a large urban area of the United States and, comparing the traced-vaccination policy to mass vaccination from the moment an attack is recognized, find that mass vaccination results in both far fewer deaths and much faster epidemic eradication.
Abstract: In the event of a smallpox bioterrorist attack in a large U.S. city, the interim response policy is to isolate symptomatic cases, trace and vaccinate their contacts, quarantine febrile contacts, but vaccinate more broadly if the outbreak cannot be contained by these measures. We embed this traced vaccination policy in a smallpox disease transmission model to estimate the number of cases and deaths that would result from an attack in a large urban area. Comparing the results to mass vaccination from the moment an attack is recognized, we find that mass vaccination results in both far fewer deaths and much faster epidemic eradication over a wide range of disease and intervention policy parameters, including those believed most likely, and that mass vaccination similarly outperforms the existing policy of starting with traced vaccination and switching to mass vaccination only if required.
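
The mass-versus-traced comparison can be caricatured with a toy deterministic SEIR model in which the two policies differ only in daily vaccination capacity. All rates, the case-fatality ratio, and the population figures below are illustrative placeholders, not the paper's calibrated smallpox parameters.

```python
def epidemic_deaths(vacc_per_day, pop=1_000_000, beta=0.3, sigma=0.2,
                    gamma=0.1, cfr=0.3, initial_infected=100, days=365):
    """Toy discrete-time SEIR with a fixed daily vaccination capacity.
    beta: transmission rate, sigma: incubation exit rate, gamma:
    removal rate, cfr: case-fatality ratio. Illustrative values only."""
    s = float(pop - initial_infected)
    e, i, r, deaths = 0.0, float(initial_infected), 0.0, 0.0
    for _ in range(days):
        new_exposed = beta * s * i / pop
        vaccinated = min(vacc_per_day, s - new_exposed)  # capacity-limited
        new_infectious = sigma * e
        removed = gamma * i
        s -= new_exposed + vaccinated
        e += new_exposed - new_infectious
        i += new_infectious - removed
        r += removed * (1 - cfr) + vaccinated
        deaths += removed * cfr
    return deaths

mass = epidemic_deaths(vacc_per_day=100_000)  # broad, fast campaign
traced = epidemic_deaths(vacc_per_day=5_000)  # limited tracing capacity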

461 citations

Journal ArticleDOI
TL;DR: In this article, the procurement of new components for recyclable products in the context of Kodak's single-use camera is addressed, where the objective is to find an ordering policy that minimizes the total expected procurement, inventory holding, and lost sales cost.
Abstract: We address the procurement of new components for recyclable products in the context of Kodak's single-use camera. The objective is to find an ordering policy that minimizes the total expected procurement, inventory holding, and lost sales cost. Distinguishing characteristics of the system are the uncertainty and unobservability associated with return flows of used cameras. We model the system as a closed queueing network, develop a heuristic procedure for adaptive estimation and control, and illustrate our methods with disguised data from Kodak. Using this framework, we investigate the effects of various system characteristics such as informational structure, procurement delay, demand rate, and length of the product's life cycle.
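
The flavor of the paper's adaptive estimation-and-control procedure can be sketched with two simple heuristics: an order-up-to rule that nets out forecast returns, and an exponential-smoothing update of the return fraction. Both are simplified stand-ins for the paper's policy; the function names and parameters are illustrative.

```python
def order_quantity(order_up_to, on_hand, on_order, expected_returns):
    """Order-up-to heuristic: order the shortfall after counting stock
    on hand, outstanding orders, and forecast used-camera returns."""
    return max(0, order_up_to - on_hand - on_order - expected_returns)

def update_return_fraction(estimate, returns, shipped, alpha=0.2):
    """Exponentially smooth the observed fraction of shipped cameras
    that come back, since return flows are uncertain and only partly
    observable."""
    if shipped == 0:
        return estimate
    return (1 - alpha) * estimate + alpha * (returns / shipped)
```

Higher forecast returns directly reduce new-component orders, which is why estimating the return fraction well matters so much in this setting.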

429 citations

Journal ArticleDOI
TL;DR: A mathematical model of a cows-to-consumers supply chain associated with a single milk-processing facility that is the victim of a deliberate release of botulinum toxin reveals that timely in-process testing could eliminate the threat of this scenario at a cost of <1 cent per gallon and should be pursued aggressively.
Abstract: We developed a mathematical model of a cows-to-consumers supply chain associated with a single milk-processing facility that is the victim of a deliberate release of botulinum toxin. Because centralized storage and processing lead to substantial dilution of the toxin, a minimum amount of toxin is required for the release to do damage. Irreducible uncertainties regarding the dose–response curve prevent us from quantifying the minimum effective release. However, if terrorists can obtain enough toxin, and this may well be possible, then rapid distribution and consumption result in several hundred thousand poisoned individuals if detection from early symptomatics is not timely. Timely and specific in-process testing has the potential to eliminate the threat of this scenario at a cost of <1 cent per gallon and should be pursued aggressively. Investigation of improving the toxin inactivation rate of heat pasteurization without sacrificing taste or nutrition is warranted.
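
The dilution argument is simple arithmetic: a release must be large enough that a single serving drawn from a well-mixed tank still carries a harmful dose. A sketch with placeholder numbers (the paper stresses that the dose-response curve is too uncertain to pin down the true threshold, so the 0.001 mg figure below is purely illustrative):

```python
def poisoned_servings(toxin_grams, tank_liters, harmful_dose_mg=0.001,
                      serving_liters=0.25):
    """Servings from a well-mixed tank that carry at least one harmful
    dose. The harmful_dose_mg threshold is a placeholder, not a value
    from the paper."""
    mg_per_serving = (toxin_grams * 1000.0 / tank_liters) * serving_liters
    if mg_per_serving < harmful_dose_mg:
        return 0  # diluted below the threshold: no serving is harmful
    return int(tank_liters / serving_liters)
```

Below the threshold the attack fails entirely; above it, every serving in the tank is contaminated, which is why centralized storage both protects against small releases and amplifies large ones.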

361 citations


Cited by
Journal ArticleDOI
TL;DR: A review of P. Billingsley's monograph Convergence of Probability Measures (Wiley, 1968), a standard reference on weak convergence of probability measures on metric spaces.
Abstract: Convergence of Probability Measures. By P. Billingsley. Chichester, Sussex, Wiley, 1968. xii, 253 p. 9 1/4“. 117s.

5,689 citations

Book ChapterDOI
01 Jan 2011
TL;DR: Weak convergence methods in metric spaces are studied, with applications sufficient to show their power and utility; the results of the first three chapters are used in Chapter 4 to derive a variety of limit theorems for dependent sequences of random variables.
Abstract: The author's preface gives an outline: "This book is about weak convergence methods in metric spaces, with applications sufficient to show their power and utility. The Introduction motivates the definitions and indicates how the theory will yield solutions to problems arising outside it. Chapter 1 sets out the basic general theorems, which are then specialized in Chapter 2 to the space C[0, 1] of continuous functions on the unit interval and in Chapter 3 to the space D[0, 1] of functions with discontinuities of the first kind. The results of the first three chapters are used in Chapter 4 to derive a variety of limit theorems for dependent sequences of random variables." The book develops and expands on Donsker's 1951 and 1952 papers on the invariance principle and empirical distributions. The basic random variables remain real-valued although, of course, measures on C[0, 1] and D[0, 1] are vitally used. Within this framework, there are various possibilities for a different and apparently better treatment of the material. More of the general theory of weak convergence of probabilities on separable metric spaces would be useful. Metrizability of the convergence is not brought up until late in the Appendix. The close relation of the Prokhorov metric and a metric for convergence in probability is (hence) not mentioned (see V. Strassen, Ann. Math. Statist. 36 (1965), 423-439; the reviewer, ibid. 39 (1968), 1563-1572). This relation would illuminate and organize such results as Theorems 4.1, 4.2 and 4.4, which give isolated, ad hoc connections between weak convergence of measures and nearness in probability. In the middle of p. 16, it should be noted that C*(S) consists of signed measures which need only be finitely additive if S is not compact. On p. 239, where the author twice speaks of separable subsets having nonmeasurable cardinal, he means "discrete" rather than "separable."
Theorem 1.4 is Ulam's theorem that a Borel probability on a complete separable metric space is tight. Theorem 1 of Appendix 3 weakens completeness to topological completeness. After mentioning that probabilities on the rationals are tight, the author says it is an

3,554 citations

Journal ArticleDOI
TL;DR: In this paper, the authors take the community of practice as a unifying unit of analysis for understanding knowledge in the firm, and suggest that often too much attention is paid to the idea of community, too little to the implications of practice.
Abstract: While the recent focus on knowledge has undoubtedly benefited organizational studies, the literature still presents a sharply contrasting and even contradictory view of knowledge, which at times is described as "sticky" and at other times "leaky." This paper is written on the premise that there is more than a problem with metaphors at issue here, and more than accounts of different types of knowledge (such as "tacit" and "explicit") can readily explain. Rather, these contrary descriptions of knowledge reflect different, partial, and sometimes "balkanized" perspectives from which knowledge and organization are viewed. Taking the community of practice as a unifying unit of analysis for understanding knowledge in the firm, the paper suggests that often too much attention is paid to the idea of community, too little to the implications of practice. Practice, we suggest, creates epistemic differences among the communities within a firm, and the firm's advantage over the market lies in dynamically coordinating the knowledge produced by these communities despite such differences. In making this argument, we argue that analyses of systemic innovation should be extended to embrace all firms in a knowledge economy, not just the classically innovative. This extension will call for a transformation of conventional ideas of coordination and of the trade-off between exploration and exploitation.

3,382 citations

Book ChapterDOI
TL;DR: This chapter extends the newsvendor model by allowing the retailer to choose the retail price in addition to the stocking quantity, and discusses an infinite horizon stochastic demand model in which the retailer receives replenishments from a supplier after a constant lead time.
Abstract: Publisher Summary This chapter reviews supply chain coordination with contracts. Numerous supply chain models are discussed. In each model, the supply chain optimal actions are identified. The chapter extends the newsvendor model by allowing the retailer to choose the retail price in addition to the stocking quantity. Coordination is more complex in this setting because the incentives provided to align one action might cause distortions with the other action. The newsvendor model is also extended by allowing the retailer to exert costly effort to increase demand. Coordination is challenging because the retailer's effort is noncontractible—that is, the firms cannot write contracts based on the effort chosen. The chapter also discusses an infinite horizon stochastic demand model in which the retailer receives replenishments from a supplier after a constant lead time. Coordination requires that the retailer chooses a large base-stock level.
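
The newsvendor model at the heart of these extensions sets the stocking quantity at the critical fractile of the demand distribution. A minimal sketch under normally distributed demand follows; the prices, costs, and demand parameters are made-up examples, not values from the chapter.

```python
from statistics import NormalDist

def newsvendor_quantity(mean, sd, price, cost, salvage=0.0):
    """Stocking quantity maximizing expected profit for normal demand:
    the critical fractile is underage / (underage + overage), where
    underage = price - cost and overage = cost - salvage."""
    underage = price - cost
    overage = cost - salvage
    fractile = underage / (underage + overage)
    return mean + sd * NormalDist().inv_cdf(fractile)

# Demand ~ N(100, 20); sell at 10, buy at 4, unsold units worthless.
q = newsvendor_quantity(100, 20, price=10.0, cost=4.0)
```

With these numbers the critical fractile is 0.6, so the retailer stocks slightly above mean demand; a left-alone retailer facing wholesale prices above production cost would choose a lower fractile, which is the coordination gap the chapter's contracts are designed to close.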

2,626 citations