
Showing papers in "Operations Research in 1999"


Journal ArticleDOI
TL;DR: This paper provides a comprehensive review that synthesizes existing results for the single period problem and develops additional results to enrich the existing knowledge base, and reviews and develops insight into a dynamic inventory extension of this problem.
Abstract: In the newsvendor problem, a decision maker facing random demand for a perishable product decides how much of it to stock for a single selling period. This simple problem with its intuitively appealing solution is a crucial building block of stochastic inventory theory, which comprises a vast literature focusing on operational efficiency. Typically in this literature, market parameters such as demand and selling price are exogenous. However, incorporating these factors into the model can provide an excellent vehicle for examining how operational problems interact with marketing issues to influence decision making at the firm level. In this paper we examine an extension of the newsvendor problem in which stocking quantity and selling price are set simultaneously. We provide a comprehensive review that synthesizes existing results for the single period problem and develop additional results to enrich the existing knowledge base. We also review and develop insight into a dynamic inventory extension of this problem, and motivate the applicability of such models.
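
As a worked illustration of the price-setting newsvendor described above, the sketch below jointly searches over price and stocking quantity for a simple additive demand model (expected demand linear in price plus normal noise). The demand model, all parameters, and the grid search are illustrative assumptions, not the paper's formulation; for each candidate price, the quantity is set at the usual critical fractile.

import numpy as np
from scipy import stats

# Illustrative additive demand model: D(p) = a - b*p + normal noise, truncated at 0.
a, b, sigma = 100.0, 2.0, 10.0
c, salvage = 5.0, 1.0              # unit cost and salvage value (assumed)

def expected_profit(p, q, n=20000, seed=0):
    rng = np.random.default_rng(seed)          # common random numbers across candidates
    d = np.maximum(a - b * p + rng.normal(0.0, sigma, n), 0.0)
    sales = np.minimum(d, q)
    return np.mean(p * sales + salvage * np.maximum(q - d, 0.0) - c * q)

# For each candidate price, stock the critical-fractile quantity, then keep the best pair.
candidates = ((p, (a - b * p) + sigma * stats.norm.ppf((p - c) / (p - salvage)))
              for p in np.linspace(c + 0.5, 45.0, 90))
price, quantity = max(candidates, key=lambda pq: expected_profit(*pq))
print("price %.2f, quantity %.1f" % (price, quantity))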

1,579 citations


Journal ArticleDOI
TL;DR: This paper addresses the simultaneous determination of pricing and inventory replenishment strategies in the face of demand uncertainty by analyzing the following single item, periodic review model.
Abstract: This paper addresses the simultaneous determination of pricing and inventory replenishment strategies in the face of demand uncertainty. More specifically, we analyze the following single item, periodic review model. Demands in consecutive periods are independent, but their distributions depend on the item's price in accordance with general stochastic demand functions. The price charged in any given period can be specified dynamically as a function of the state of the system. A replenishment order may be placed at the beginning of some or all of the periods. Stockouts are fully backlogged. We address both finite and infinite horizon models, with the objective of maximizing total expected discounted profit or its time average value, assuming that prices can either be adjusted arbitrarily (upward or downward) or that they can only be decreased. We characterize the structure of an optimal combined pricing and inventory strategy for all of the above types of models. We also develop an efficient value iteration method to compute these optimal strategies. Finally, we report on an extensive numerical study that characterizes various qualitative properties of the optimal strategies and corresponding optimal profit values.
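
The sketch below is a toy backward-induction (value iteration) version of joint pricing and replenishment with backlogging: in each period it chooses an order-up-to level and a price from a small menu, with Poisson demand whose mean depends on the price. The demand model, cost parameters, and discretization are assumptions for illustration only; the paper establishes structural results and an efficient computational method that this brute-force sketch does not exploit.

import numpy as np
from scipy.stats import poisson

prices = [6.0, 8.0, 10.0]                         # small price menu (assumed)
mean_demand = {6.0: 8.0, 8.0: 5.0, 10.0: 3.0}     # higher price -> lower expected demand
c, h, b, T = 2.0, 0.5, 4.0, 5                     # unit cost, holding, backlog penalty, horizon
states = np.arange(-20, 41)                       # net inventory (negative = backlog)
dvals = np.arange(0, 40)                          # truncated demand support
pmf = {p: poisson.pmf(dvals, mean_demand[p]) for p in prices}
V = np.zeros(len(states))                         # terminal value

for _ in range(T):                                # backward induction over the horizon
    newV = np.empty(len(states))
    for i, x in enumerate(states):
        best = -np.inf
        for y in range(int(x), int(states[-1]) + 1):    # order up to level y >= x
            for p in prices:
                inv = y - dvals                         # end-of-period net inventory
                # with full backlogging, every unit of demand is eventually sold at price p
                stage = (p * mean_demand[p] - c * (y - int(x))
                         - h * pmf[p] @ np.maximum(inv, 0)
                         - b * pmf[p] @ np.maximum(-inv, 0))
                nxt = np.clip(inv, states[0], states[-1]) - states[0]
                best = max(best, stage + pmf[p] @ V[nxt])
        newV[i] = best
    V = newV

print(round(V[0 - states[0]], 2))                 # expected T-period profit from zero inventory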

676 citations


Journal ArticleDOI
TL;DR: A Lagrangian-based heuristic for the well-known Set Covering Problem (SCP), which won the first prize in the FASTER competition, and proposes a dynamic pricing scheme for the variables, akin to that used for solving large-scale LPs, to be coupled with subgradient optimization and greedy algorithms.
Abstract: We present a Lagrangian-based heuristic for the well-known Set Covering Problem (SCP). The algorithm was initially designed for solving very large scale SCP instances, involving up to 5,000 rows an...

423 citations


Journal ArticleDOI
TL;DR: In this paper, the authors make extensive computer searches for good parameter sets, with respect to the spectral test, for combined multiple recursive generators of different sizes and compare different implementations and give a specific code in C that is faster than previous implementations of similar generators.
Abstract: Combining parallel multiple recursive sequences provides an efficient way of implementing random number generators with long periods and good structural properties. Such generators are statistically more robust than simple linear congruential generators that fit into a computer word. We made extensive computer searches for good parameter sets, with respect to the spectral test, for combined multiple recursive generators of different sizes. We also compare different implementations and give a specific code in C that is faster than previous implementations of similar generators.
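
For concreteness, the following pure-Python sketch shows the mechanics of a combined multiple recursive generator: two order-3 recurrences modulo large primes whose outputs are combined. The constants are those commonly quoted for the MRG32k3a generator from this line of work, but treat the transcription as illustrative and verify against the published tables (and use the authors' C implementation) for any serious application.

class CombinedMRG:
    M1, M2 = 4294967087, 4294944443           # two large moduli near 2**32

    def __init__(self, seed=12345):
        # each component keeps its last three states; a proper seeding routine is omitted
        self.s1 = [seed % self.M1 or 1] * 3
        self.s2 = [seed % self.M2 or 1] * 3

    def next(self):
        # component 1: x_n = (1403580*x_{n-2} - 810728*x_{n-3}) mod M1
        p1 = (1403580 * self.s1[1] - 810728 * self.s1[0]) % self.M1
        self.s1 = [self.s1[1], self.s1[2], p1]
        # component 2: x_n = (527612*x_{n-1} - 1370589*x_{n-3}) mod M2
        p2 = (527612 * self.s2[2] - 1370589 * self.s2[0]) % self.M2
        self.s2 = [self.s2[1], self.s2[2], p2]
        z = (p1 - p2) % self.M1                # combine the two components
        return (z if z > 0 else self.M1) / (self.M1 + 1)

rng = CombinedMRG()
print([round(rng.next(), 6) for _ in range(3)])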

408 citations


Journal ArticleDOI
TL;DR: A limited computational study using two products illustrates the benefits of solving for the optimal quantities with substitution considered at the ordering stage, relative to otherwise similar computations that ignore substitution when ordering.
Abstract: We study a single period multiproduct inventory problem with substitution and proportional costs and revenues. We consider N products and N demand classes with full downward substitution, i.e., excess demand for class i can be satisfied using product j for i ≥ j. We first discuss a two-stage profit maximization formulation for the multiproduct substitution problem. We show that a greedy allocation policy is optimal. We use this to write the expected profit and its first partials explicitly. This in turn enables us to prove additional properties of the profit function and several interesting properties of the optimal solution. In a limited computational study using two products, we illustrate the benefits of solving for the optimal quantities with substitution considered at the ordering stage, relative to otherwise similar computations that ignore substitution when ordering. Specifically, we show that the benefits are higher with high demand variability, low substitution cost, low profit margins (or low price to cost ratio), high salvage values, and similarity of products in terms of prices and costs.
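
To make the second-stage allocation concrete, here is a small sketch of one simple greedy rule under full downward substitution: each demand class is served first from its own product and then from leftover higher-grade products. The stock, demand, revenue, and substitution-cost figures are illustrative assumptions, and this particular rule is only in the spirit of the greedy policy analyzed in the paper.

def greedy_allocate(stock, demand, revenue, sub_cost=0.5):
    """stock[j]: units of product j; demand[i], revenue[i]: demand and unit profit of class i."""
    stock, profit = list(stock), 0.0
    for i, d in enumerate(demand):
        used = min(stock[i], d)               # serve class i from its own product first
        stock[i] -= used
        d -= used
        profit += revenue[i] * used
        for j in range(i - 1, -1, -1):        # then substitute downward from products j < i
            if d == 0:
                break
            used = min(stock[j], d)
            stock[j] -= used
            d -= used
            profit += (revenue[i] - sub_cost) * used
        # any demand still unmet is lost
    return profit, stock

print(greedy_allocate(stock=[5, 4, 6], demand=[3, 7, 8], revenue=[10.0, 8.0, 6.0]))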

348 citations


Journal ArticleDOI
TL;DR: An oligopoly with spatially dispersed generators and consumers and with multi-period demand is modeled in this paper and two variants of the model, respectively based on average-cost and marginal-cost pricing for transmission services, are formulated.
Abstract: An oligopoly with spatially dispersed generators and consumers and with multi-period demand is modeled in this paper. The producers are assumed to behave in a Cournot manner with regulated transmission prices. A (generalized) Nash equilibrium is sought. The story of the game is as follows. Each generator takes its rivals' output (generation, supply, and flows) and the prices for transmission services as fixed when it decides upon its output to maximize its profit; the transmission firm takes the quantities of transmission services demanded by the generators as fixed when it determines the transmission prices according to certain regulatory rules. An equilibrium of the model is a set of generation outputs at which no generator will obtain more profit if it unilaterally modifies its output from this set, and a set of transmission prices satisfying certain regulatory requirements. A variational inequality approach is used for computing the equilibria of the model. Using the same approach, two variants of the model, respectively based on average-cost and marginal-cost pricing for transmission services, are also formulated. This model is applied to simulate a long-run electricity market where transmission prices are regulated.

307 citations


Journal ArticleDOI
TL;DR: The method the authors study reinforces promising paths at intermediate thresholds by splitting them into subpaths which then evolve independently, which has the effect of dedicating a greater fraction of the computational effort to informative runs.
Abstract: We analyze the performance of a splitting technique for the estimation of rare event probabilities by simulation. A straightforward estimator of the probability of an event evaluates the proportion of simulated paths on which the event occurs. If the event is rare, even a large number of paths may produce little information about its probability using this approach. The method we study reinforces promising paths at intermediate thresholds by splitting them into subpaths which then evolve independently. If implemented appropriately, this has the effect of dedicating a greater fraction of the computational effort to informative runs. We analyze the method for a class of models in which, roughly speaking, the number of states through which each threshold can be crossed is bounded. Under additional assumptions, we identify the optimal degree of splitting at each threshold as the rarity of the event increases: It should be set so that the expected number of subpaths reaching each threshold remains roughly constant. Thus implemented, the method is provably effective in a sense appropriate to rare event simulations. These results follow from a branching-process analysis of the method. We illustrate our theoretical results with some numerical examples for queueing models.
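
The toy sketch below applies fixed-factor splitting to a textbook rare event: a biased random walk started at level 1 must reach level 10 before hitting 0. Every path that reaches an intermediate threshold is split into r independent subpaths, and the estimator divides the final count by n0 * r^(number of splitting stages). The thresholds, splitting factor, and walk parameters are illustrative assumptions, not taken from the paper.

import random

def run_to(start, target, p_up=0.3):
    """Simulate the walk from `start` until it hits `target` (success) or 0 (failure)."""
    x = start
    while 0 < x < target:
        x += 1 if random.random() < p_up else -1
    return x == target

def splitting_estimate(thresholds=(2, 4, 6, 8, 10), n0=1000, r=5, p_up=0.3, seed=1):
    random.seed(seed)
    # stage 0: independent paths from the start level 1 up to the first threshold
    alive = sum(run_to(1, thresholds[0], p_up) for _ in range(n0))
    # later stages: every surviving path is split into r independent subpaths
    for lo, hi in zip(thresholds, thresholds[1:]):
        alive = sum(run_to(lo, hi, p_up) for _ in range(alive * r))
    return alive / (n0 * r ** (len(thresholds) - 1))

# the exact gambler's-ruin probability for these parameters is about 2.8e-4
print(splitting_estimate())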

271 citations


Journal ArticleDOI
TL;DR: This paper provides a tutorial introduction to option pricing methods, focusing on how they relate to and can be integrated with decision analysis methods, and describes some lessons learned in using these methods to evaluate some real oil and gas investments.
Abstract: Many firms in the oil and gas business have long used decision analysis techniques to evaluate exploration and development opportunities and have looked at recent development in option pricing theory as potentially offering improvements over the decision analysis approach. Unfortunately, it is difficult to discern the benefits of the options approach from the literature on the topic: Most of the published examples greatly oversimplify the kinds of projects encountered in practice, and comparisons are typically made to traditional discounted cash flow analysis, which, unlike the option pricing and decision analytic approaches, does not explicitly consider the uncertainty in project cash flows. In this paper, we provide a tutorial introduction to option pricing methods, focusing on how they relate to and can be integrated with decision analysis methods, and describe some lessons learned in using these methods to evaluate some real oil and gas investments.
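
As a concrete point of reference for the option pricing side of the comparison, the sketch below values a simple option to invest in a project whose value follows a multiplicative binomial lattice (the standard Cox-Ross-Rubinstein construction). The figures are assumptions chosen for illustration and are unrelated to the paper's oil and gas examples.

import math

def binomial_option_value(V0=100.0, K=105.0, r=0.05, sigma=0.3, T=2.0, steps=100):
    """Value of a European-style option to pay K for a project worth V0 today."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))          # up factor (CRR parameterization)
    d = 1.0 / u
    q = (math.exp(r * dt) - d) / (u - d)         # risk-neutral up probability
    disc = math.exp(-r * dt)
    # terminal payoffs of the investment option
    values = [max(V0 * u**j * d**(steps - j) - K, 0.0) for j in range(steps + 1)]
    for _ in range(steps):                       # roll back through the lattice
        values = [disc * (q * values[j + 1] + (1 - q) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

print(round(binomial_option_value(), 2))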

264 citations


Journal ArticleDOI
TL;DR: Numerical tests on problems from Air France have demonstrated that this method is capable of solving very large scale problems with thousands of constraints and hundreds of subproblems and of reducing solution time by a factor of about a thousand.
Abstract: This article describes a method for solving the crew rostering problem in air transportation. This problem consists of constructing personalized schedules that assign pairings, days off, and other activities to airline crew members. A generalized set partitioning model and a method using column generation have been used. This method has been adapted in a number of ways to take advantage of the nature of the problem and to accelerate solution. Numerical tests on problems from Air France have demonstrated that this method is capable of solving very large scale problems with thousands of constraints and hundreds of subproblems. The tests have also shown that these adaptations are capable of reducing solution time by a factor of about a thousand. Finally, results from this method are compared with those obtained with the method currently used at Air France.

246 citations


Journal ArticleDOI
TL;DR: This paper presents and solves a single-period, multiproduct, downward substitution model that has one raw material as the production input and produces N different products as outputs and compares three different solution methods.
Abstract: In this paper, we present and solve a single-period, multiproduct, downward substitution model. Our model has one raw material as the production input and produces N different products as outputs. The demands and yields for the products are random. We determine the optimal production input and allocation of the N products to satisfy demands. The problem is modeled as a two-stage stochastic program, which we show can be decomposed into a parameterized network flow problem. We present and compare three different solution methods: a stochastic linear program, a decomposition resulting in a series of network flow subproblems, and a decomposition where the same network flow subproblems are solved by a new greedy algorithm.

209 citations


Journal ArticleDOI
TL;DR: A variety of resource allocation problems in which it is desirable to allocate limited resources equitably among competing activities are reviewed, to help practitioners to formulate and solve diverse resource allocation problems, and motivate researchers to explore new models and algorithmic approaches.
Abstract: In this expository paper, we review a variety of resource allocation problems in which it is desirable to allocate limited resources equitably among competing activities. Applications for such problems are found in diverse areas, including distribution planning, production planning and scheduling, and emergency services location. Each activity is associated with a performance function, representing, for example, the weighted shortfall of the selected activity level from a specified target. A resource allocation solution is called equitable if no performance function value can be improved without either violating a constraint or degrading an already equal or worse-off (i.e., larger) performance function value that is associated with a different activity. A lexicographic minimax solution determines this equitable solution; that is, it determines the lexicographically smallest vector whose elements, the performance function values, are sorted in nonincreasing order. The problems reviewed include large-scale allocation problems with multiple knapsack resource constraints, multiperiod allocation problems for storable resources, and problems with substitutable resources. The solution of large-scale problems necessitates the design of efficient algorithms that take advantage of special mathematical structures. Indeed, efficient algorithms for many models will be described. We expect that this paper will help practitioners to formulate and solve diverse resource allocation problems, and motivate researchers to explore new models and algorithmic approaches.

Journal ArticleDOI
TL;DR: In this article, a multicomponent, multiproduct production and inventory system was studied in which individual components are made to stock but final products are assembled to customer orders, and the key performance measures, including the probability of fulfilling a customer order within any specified time window, were derived.
Abstract: We study a multicomponent, multiproduct production and inventory system in which individual components are made to stock but final products are assembled to customer orders. Each component is produced by an independent production facility with finite capacity, and the component inventory is controlled by an independent base-stock policy. For any given base-stock policy, we derive the key performance measures, including the probability of fulfilling a customer order within any specified time window. Computational procedures and numerical examples are also presented. A similar approach applies to the generic multi-item make-to-stock inventory systems in which a typical customer order consists of a kit of items.

Journal ArticleDOI
TL;DR: This paper describes a method for estimating the future value function by multivariate adaptive regression splines (MARS) fit over a discretization scheme based on orthogonal array (OA) experimental designs and shows that this method can accurately solve SDP problems of higher dimension than was previously possible.
Abstract: In stochastic dynamic programming (SDP) with continuous state and decision variables, the future value function is computed at discrete points in the state space. Interpolation can be used to approximate the values of the future value function between these discrete points. However, for large dimensional problems the number of discrete points required to obtain a good approximation of the future value function can be prohibitively large. Statistical methods of experimental design and function estimation may be employed to overcome this "curse of dimensionality." In this paper, we describe a method for estimating the future value function by multivariate adaptive regression splines (MARS) fit over a discretization scheme based on orthogonal array (OA) experimental designs. Because orthogonal arrays only grow polynomially in the state-space dimension, our OA/MARS method can accurately solve SDP problems of higher dimension than was previously possible. To our knowledge, the most efficient method published prior to this work employs tensor-product cubic splines to approximate the future value function (Johnson et al. 1993). The computational advantages of OA/MARS are demonstrated in comparisons with the method using tensor-product cubic splines for applications of an inventory forecasting SDP with up to nine state variables computed on a small workstation. In particular, the storage of an adequate tensor-product cubic spline for six dimensions exceeds the memory of our workstation, and an accurate OA/MARS SDP solution would run at least an order of magnitude faster than one using tensor-product cubic splines for more than six dimensions.

Journal ArticleDOI
TL;DR: In this article, strong and fast linear programming lower bounds are computed for an important class of machine scheduling problems with additive objective functions, where the order of the jobs in the relevant part of the schedule is obtained through some priority rule.
Abstract: Parallel machine scheduling problems concern the scheduling of n jobs on m machines to minimize some function of the job completion times. If preemption is not allowed, then most problems are not only NP-hard, but also very hard from a practical point of view. In this paper, we show that strong and fast linear programming lower bounds can be computed for an important class of machine scheduling problems with additive objective functions. Characteristic of these problems is that on each machine the order of the jobs in the relevant part of the schedule is obtained through some priority rule. To that end, we formulate these parallel machine scheduling problems as a set covering problem with an exponential number of binary variables, n covering constraints, and a single side constraint. We show that the linear programming relaxation can be solved efficiently by column generation because the pricing problem is solvable in pseudo-polynomial time. We display this approach on the problem of minimizing total weighted completion time on m identical machines. Our computational results show that the lower bound is singularly strong and that the outcome of the linear program is often integral. Moreover, they show that our branch-and-bound algorithm that uses the linear programming lower bound outperforms the previously best algorithm.

Journal ArticleDOI
TL;DR: In this paper, the authors study the weighted tardiness job-shop scheduling problem, taking into consideration the presence of random shop disturbances, and develop a decomposition method that partitions job operations into an ordered sequence of subsets and resolves a "crucial subset" of scheduling decisions through the use of a branch-and-bound algorithm.
Abstract: In this paper we study the weighted tardiness job-shop scheduling problem, taking into consideration the presence of random shop disturbances. A basic thesis of the paper is that global scheduling performance is determined primarily by a subset of the scheduling decisions to be made. By making these decisions in an a priori static fashion, which maintains a global perspective, overall performance efficiency can be achieved. Further, by allowing the remaining decisions to be made dynamically, flexibility can be retained in the schedule to compensate for unforeseen system disturbances. We develop a decomposition method that partitions job operations into an ordered sequence of subsets. This decomposition identifies and resolves a "crucial subset" of scheduling decisions through the use of a branch-and-bound algorithm. We conduct computational experiments that demonstrate the performance of the approach under deterministic cases, and the robustness of the approach under a wide range of processing time perturbations. We show that the performance of the method is superior, particularly for low to medium levels of disturbances.

Journal ArticleDOI
TL;DR: A model that simultaneously plans capacity investment, inventory investment, and the production schedule using return on assets as the objective to maximize is developed and an algorithm is developed that optimizes a fractional objective function for a mixed-integer program.
Abstract: Manufacturing managers often address capacity and inventory decisions separately, thus ignoring the interaction between capacity and inventory within a manufacturing system. The separation of these two decisions can lead to an imbalance of capacity and inventory investment. We develop a model that simultaneously plans capacity investment, inventory investment, and the production schedule using return on assets as the objective to maximize. An algorithm is developed that optimizes a fractional objective function for a mixed-integer program. The model was applied at an electronics manufacturer and at a manufacturer of office supplies.

Journal ArticleDOI
TL;DR: This analysis of a single item periodic review inventory problem with random yield and stochastic demand in terms of the inventory position at the end of a period provides interesting insights into the problem and leads to easily implementable and highly accurate myopic heuristics.
Abstract: We consider a single item periodic review inventory problem with random yield and stochastic demand. The yield is proportional to the quantity ordered, with the multiplicative factor being a random variable. The demands are stochastic and are independent across the periods, but they need not be stationary. The holding, penalty, and ordering costs are linear. Any unsatisfied demands are backlogged. Two cases for the ordering cost are considered: The ordering cost can be proportional to either the quantity ordered (e.g., in-house production) or the quantity received (e.g., delivery by an external supplier). Random yield problems have been addressed previously in the literature, but no constructive solutions or algorithms are presented except for simple heuristics that are far from optimal. In this paper, we present a novel analysis of the problem in terms of the inventory position at the end of a period. This analysis provides interesting insights into the problem and leads to easily implementable and highly accurate myopic heuristics. A detailed computational study is done to evaluate the heuristics. The study is done for the infinite horizon case, with stationary yields and demands, and for the finite horizon case with a 26-period seasonal demand pattern. The best of our heuristics has worst-case errors of 3.0% and 5.0% and average errors of 0.6% and 1.2% for the infinite and finite horizon cases, respectively.

Journal ArticleDOI
TL;DR: This work analyzes an (s, S) continuous review perishable inventory system with a general renewal demand process and instantaneous replenishments, and obtains closed-form solutions for the steady state probability distribution of the inventory level and system performance measures.
Abstract: We analyze an (s, S) continuous review perishable inventory system with a general renewal demand process and instantaneous replenishments. Though continuous review systems seem more amenable to optimization analysis than do periodic review systems, the existing literature addressing this type of model is rather limited. This limitation motivated us to seek greater understanding of this important class of inventory models. Using a Markov renewal approach, we obtain closed-form solutions for the steady state probability distribution of the inventory level and system performance measures. We then construct a closed-form expected cost function. Useful analytical properties for the cost function are identified and extensive computations are conducted to examine the impact of different parameters. The numerical results illustrate the system behavior and lead to managerial insights into controlling such inventory systems.

Journal ArticleDOI
TL;DR: A review of the literature on warranty models and analysis methods is provided, along with some suggestions for further research.
Abstract: Product guarantees or warranties have been around for generations, but formal approaches for establishing and examining warranties have been considered only during the past 20 years. A review of the literature on warranty models and analysis methods is provided, along with some suggestions for further research.

Journal ArticleDOI
TL;DR: All possible asymptotic behavior of "bucket brigade" production lines with two or three workers, each characterized by a constant work velocity are described, suggesting wariness in interpreting simulation results.
Abstract: We describe all possible asymptotic behavior of "bucket brigade" production lines with two or three workers, each characterized by a constant work velocity. The results suggest wariness in interpreting simulation results. They also suggest a strategy for partitioning a workforce into effective teams to staff the lines.
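
A minimal simulation of the deterministic dynamics makes the convergence behavior easy to see: the sketch below iterates the hand-off map for a line whose work content is normalized to 1, assuming instantaneous walk-back and no blocking (which holds when workers are sequenced from slowest to fastest). The velocities are illustrative assumptions.

def bucket_brigade(v, resets=30):
    """v: worker velocities, listed from the start of the line to the end."""
    p = [0.0] * len(v)                  # positions of the items each worker currently holds
    for _ in range(resets):
        t = (1.0 - p[-1]) / v[-1]       # time until the last worker completes an item
        # hand-offs: worker 1 restarts at 0, worker i takes over worker i-1's item
        p = [0.0] + [p[i] + v[i] * t for i in range(len(v) - 1)]
    return p                            # hand-off positions after `resets` completions

# slowest-to-fastest ordering: hand-off points converge (here to 1/6 and 1/2)
print(bucket_brigade([1.0, 2.0, 3.0]))
# other orderings require modeling blocking and need not converge, which is the
# kind of asymptotic behavior the paper characterizes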

Journal ArticleDOI
TL;DR: In this article, it is shown that it is not reasonable to assume a uniform distribution of the weights in the core, and it is experimentally shown that the heuristic proposed by Balas and Zemel does not find as good solutions as expected.
Abstract: Since Balas and Zemel in the 1980s introduced the so-called core problem as an efficient tool for solving the Knapsack Problem, all the most successful algorithms have applied this concept. Balas and Zemel proved that if the weights in the core are uniformly distributed then there is a high probability for finding an optimal solution in the core. Items outside the core may be fathomed because of reduction rules. This paper demonstrates that generally it is not reasonable to assume a uniform distribution of the weights in the core, and it is experimentally shown that the heuristic proposed by Balas and Zemel does not find solutions as good as expected. Also, other algorithms that solve some kind of core problem may get stuck on difficult cores. This behavior has apparently not been noticed before because of insufficient testing. Capacities leading to difficult problems are identified for several categories of instance types, and it is demonstrated that the hitherto applied test instances are easier than the average. As a consequence we propose a series of new randomly generated test instances and show how recent algorithms behave when applied to these problems.
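
For readers unfamiliar with the core concept, the sketch below shows the basic mechanics on a toy instance: items are ranked by profit-to-weight ratio, the greedy break solution is built, and a small window of items around the break item (the core) is then re-optimized exactly by enumeration. The instance data and window size are illustrative assumptions; real core algorithms are considerably more sophisticated than this.

def core_heuristic(profits, weights, capacity, core_half=4):
    n = len(profits)
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    take, cap, value, break_pos = [False] * n, capacity, 0, n
    for pos, i in enumerate(order):                 # greedy fill until the break item
        if weights[i] <= cap:
            take[i], cap, value = True, cap - weights[i], value + profits[i]
        else:
            break_pos = pos
            break
    # re-optimize a small window of items around the break item by enumeration
    core = order[max(0, break_pos - core_half): break_pos + core_half]
    free_cap = cap + sum(weights[i] for i in core if take[i])
    base_val = value - sum(profits[i] for i in core if take[i])
    best = value
    for mask in range(1 << len(core)):
        chosen = [core[j] for j in range(len(core)) if mask >> j & 1]
        if sum(weights[i] for i in chosen) <= free_cap:
            best = max(best, base_val + sum(profits[i] for i in chosen))
    return best

profits = [10, 13, 7, 8, 2, 6, 9, 5]
weights = [5, 8, 3, 6, 1, 5, 7, 4]
print(core_heuristic(profits, weights, capacity=17))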

Journal ArticleDOI
TL;DR: It is shown that it is never optimal for one player to be stationary during the entire search period in the two-player rendezvous and the meeting time of n players in the worst case has an asymptotic behavior of n/2 + O(log n).
Abstract: We present two new results for the asymmetric rendezvous problem on the line. We first show that it is never optimal for one player to be stationary during the entire search period in the two-player rendezvous. Then we consider the meeting time of n players in the worst case and show that it has an asymptotic behavior of n/2 + O(log n).

Journal ArticleDOI
TL;DR: British Columbia Gas, a major utility, was required by the British Columbia Utilities Commission to develop an integrated resource plan that addressed multiple objectives and involved the participation of stakeholders; values were elicited from most of the senior executives at BC Gas, members of the BCUC, and representatives of several stakeholder groups.
Abstract: British Columbia Gas, a major utility, was required by the British Columbia Utilities Commission (BCUC) to develop an integrated resource plan that addressed multiple objectives and involved the participation of stakeholders. To assist BC Gas, we elicited values separately from most of the senior executives at BC Gas, members of the BCUC, and representatives of several stakeholder groups. Based on these values, we structured a set of objectives and associated performance measures for integrated resource planning (IRP) at BC Gas. A multistakeholder process then provided judgments about appropriate value tradeoffs among these objectives. This information was used in several ways in the IRP process. It fostered improved communication and served as a guide for designing more attractive plans and identifying future information needs. It also provided the basis for a quantitative evaluation of alternative plans and resources. Both the IRP process and the chosen integrated resource plan were reviewed by lawyers representing intervenors at a quasi-judicial hearing of the BCUC. Their concerns are informative in providing lessons for the use of the elicited values in the context of regulatory hearings.

Journal ArticleDOI
TL;DR: This paper describes these distributions in detail and shows their suitability to model self-similar behavior, and develops a so-called truncated analytical model that in the limit is power-tail.
Abstract: Power-tail distributions are those for which the reliability function is of the form x^(-α) for large x. Although they look well behaved, they have the singular property that E(X^l) = ∞ for all l ≥ α. Thus it is possible to have a distribution with an infinite variance, or even an infinite mean. As pathological as these distributions seem to be, they occur everywhere in nature, from the CPU time used by jobs on mainframe computers to sizes of files stored on discs, earthquakes, or even health insurance claims. Recently, traffic on the "electronic super highway" was revealed to be of this type, too. In this paper we first describe these distributions in detail and show their suitability to model self-similar behavior, e.g., of the traffic stated above. Then we show how these distributions can occur in computer system environments and develop a so-called truncated analytical model that in the limit is power-tail. We study and compare the effects on system performance of a GI/M/1 model both for the truncated and the limit case, and demonstrate the usefulness of these approaches particularly for Markov modeling with LAQT (Linear Algebraic Approach to Queueing Theory, Lipsky 1992) techniques.
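
The defining property is easy to see empirically. The sketch below draws Pareto samples with tail index alpha = 1.5 (finite mean, infinite variance) by inverse-transform sampling and prints running sample moments; the parameters are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(7)
alpha = 1.5                              # tail index: finite mean, infinite variance
u = rng.random(10**6)
x = (1.0 - u) ** (-1.0 / alpha)          # Pareto(alpha) on [1, inf) by inverse transform

for n in (10**3, 10**4, 10**5, 10**6):
    print(f"n={n:>7}  mean={x[:n].mean():8.3f}  var={x[:n].var():14.1f}")
# the running mean settles near alpha/(alpha - 1) = 3, but the sample variance
# keeps jumping as ever larger observations arrive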

Journal ArticleDOI
TL;DR: A parable is recited and a stochastic optimization model is formulated that determines optimal link tolls on a road network whose users' value of time is a random variable; the tolls induce an equilibrium traffic flow that is at once system-optimal and user-optimal for all trips, regardless of their value of time.
Abstract: Part I of a two-part series, this paper recites a parable and formulates a stochastic optimization model that determines optimal link tolls on a road network whose users' value of time is a random variable. The parable, introducing the problem, demonstrates the importance of the variability of the value of time. The model, cast as a variational inequality, becomes a specialized form of a bicriterion user-equilibrium traffic assignment. Its solution is a set of efficient tolls for all links in the network. These tolls induce an equilibrium traffic flow that is at once system-optimal and user-optimal for all trips, regardless of their value of time. Part II develops a solution algorithm, gives examples, and provides performance statistics.

Journal ArticleDOI
TL;DR: In this paper, the authors define a family of random trees in the plane, where nodes of level k, k = 0,...,m are the points of a homogeneous Poisson point process, whereas their arcs connect nodes of levels k and k + 1, according to the least distance principle.
Abstract: We define a family of random trees in the plane. Their nodes of level k, k = 0,...,m are the points of a homogeneous Poisson point process Πk, whereas their arcs connect nodes of level k and k + 1, according to the least distance principle: If V denotes the Voronoi cell w.r.t. Πk+1 with nucleus x, where x is a point of Πk+1, then there is an arc connecting x to all the points of Πk that belong to V. This creates a family of stationary random trees rooted in the points of Πm. These random trees are useful to model the spatial organization of several types of hierarchical communication networks. In relation to these communication networks, it is natural to associate various cost functions with such random trees. Using point process techniques, like the exchange formula between two Palm measures, and integral geometry techniques, we show how to compute these average costs as functions of the intensity parameters of the Poisson processes. The formulas derived for the average value of these cost functions can then be exploited for parametric optimization purposes. Several applications to classical and mobile cellular communication networks are presented.

Journal ArticleDOI
TL;DR: A fully polynomial approximation scheme for the problem of scheduling n jobs on a single machine to minimize total weighted earliness and tardiness is presented; the scheme recursively computes lower and upper bounds on the value of partial optimal solutions.
Abstract: A fully polynomial approximation scheme for the problem of scheduling n jobs on a single machine to minimize total weighted earliness and tardiness is presented. A new technique is used to develop the scheme. The main feature of this technique is that it recursively computes lower and upper bounds on the value of partial optimal solutions. Therefore, the scheme does not require any prior knowledge of lower and upper bounds on the value of a complete optimal solution. This distinguishes it from all the existing approximation schemes.

Journal ArticleDOI
TL;DR: A procedure that, without using the SP matrix, computes a lower bound to the CSP by finding a heuristic solution to the dual of the linear relaxation of SP, obtained by combining a number of different bounding procedures.
Abstract: The crew scheduling problem (CSP) appears in many mass transport systems (e.g., airline, bus, and railway industry) and consists of scheduling a number of crews to operate a set of transport tasks satisfying a variety of constraints. This problem is formulated as a set partitioning problem with side constraints (SP), where each column of the SP matrix corresponds to a feasible duty, which is a subset of tasks performed by a crew. We describe a procedure that, without using the SP matrix, computes a lower bound to the CSP by finding a heuristic solution to the dual of the linear relaxation of SP. This dual solution is obtained by combining a number of different bounding procedures. The dual solution is used to reduce the number of variables in the SP in such a way that the resulting SP problem can be solved by a branch-and-bound algorithm. Computational results are given for problems derived from the literature and involving from 50 to 500 tasks.

Journal ArticleDOI
TL;DR: In this paper, the authors consider an assignment problem in which persons are qualified for some but usually not all of the jobs and assume persons belong to given seniority classes and jobs have given priority levels.
Abstract: Consider an assignment problem in which persons are qualified for some but usually not all of the jobs. Moreover, assume persons belong to given seniority classes and jobs have given priority levels. Seniority constraints impose that the solution be such that no unassigned person can be given a job unless an assigned person with the same or higher seniority becomes unassigned. Priority constraints specify that the solution must be such that no unassigned job can become assigned without a job with the same or higher priority becoming unassigned. It is shown that: (i) adding such constraints does not reduce and may even increase the number of assigned persons in the optimal solution; (ii) using a greedy heuristic for constrained assignment (as often done in practice) may reduce the number of assigned persons by half, and (iii) an optimal solution to the assignment problem with both types of constraints can be obtained by solving a classical assignment problem with adequately modified coefficients.

Journal ArticleDOI
TL;DR: In this paper, the authors provide a strategic guideline as to how the design process should be managed and controlled and describe how design reviews and engineering resources can be scheduled as the control mechanisms to operationally manage development risk.
Abstract: Product development has become the focal point of industrial competition and is the cornerstone of long-term survival for most firms. One of the major management challenges in product development is to deal with development risk in the design process. In this paper we provide a strategic guideline as to how the design process should be managed and controlled. We describe how design reviews and engineering resources can be scheduled as the control mechanisms to operationally manage development risk. The methodologies developed are an integral part of a project to fundamentally restructure product design processes at Rocketdyne Division of Rockwell International, which designs and develops liquid-propellant rocket propulsion systems.