
Showing papers in "Operations Research in 2000"


Journal ArticleDOI
TL;DR: An exact algorithm for filling a single bin is developed, leading to the definition of an exact branch-and-bound algorithm for the three-dimensional bin packing problem, which also incorporates original approximation algorithms.
Abstract: The problem addressed in this paper is that of orthogonally packing a given set of rectangular-shaped items into the minimum number of three-dimensional rectangular bins. The problem is strongly NP-hard and extremely difficult to solve in practice. Lower bounds are discussed, and it is proved that the asymptotic worst-case performance ratio of the continuous lower bound is 1/8. An exact algorithm for filling a single bin is developed, leading to the definition of an exact branch-and-bound algorithm for the three-dimensional bin packing problem, which also incorporates original approximation algorithms. Extensive computational results, involving instances with up to 90 items, are presented: It is shown that many instances can be solved to optimality within a reasonable time limit.
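The continuous lower bound discussed in the abstract is simply the total item volume divided by the bin volume, rounded up. A minimal sketch (the function name is ours, not from the paper):

```python
import math

def continuous_lower_bound(items, bin_dims):
    """Continuous (volume) lower bound for 3D bin packing: total item
    volume divided by bin volume, rounded up to the next integer."""
    W, H, D = bin_dims
    total = sum(w * h * d for (w, h, d) in items)
    return math.ceil(total / (W * H * D))

# Three 5x5x5 items in 10x10x10 bins: the bound is 1, even though
# orthogonal-packing constraints may force more bins in general.
print(continuous_lower_bound([(5, 5, 5)] * 3, (10, 10, 10)))  # 1
```

The paper's result quantifies how weak this bound can be asymptotically in the worst case, which motivates the stronger lower bounds it develops.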

569 citations


Journal ArticleDOI
TL;DR: A probabilistic demand model for items in an assortment that captures the effects of substitution and a methodology for selecting item inventory levels so as to maximize total expected profit, subject to given resource constraints is developed.
Abstract: Customers for retail merchandise can often be satisfied with one of several items. Accounting for demand substitution in defining customer service influences the choice of items to stock and the optimal inventory level for each item stocked. Further, when certain items are not stocked, the resulting substitutions increase the demand for other items, which also affects the optimal stock levels. In this paper, we develop a probabilistic demand model for items in an assortment that captures the effects of substitution and a methodology for selecting item inventory levels so as to maximize total expected profit, subject to given resource constraints. Illustrative examples are solved to provide insights concerning the behavior of the optimal inventory policies, using the negative binomial demand distribution, which has performed well in fitting retail sales data.

456 citations


Journal ArticleDOI
TL;DR: The Nested Partitions (NP) method, a new randomized method for solving global optimization problems that systematically partitions the feasible region and concentrates the search in regions that are the most promising, is proposed.
Abstract: We propose a new randomized method for solving global optimization problems. This method, the Nested Partitions (NP) method, systematically partitions the feasible region and concentrates the search in regions that are the most promising. The most promising region is selected in each iteration based on information obtained from random sampling of the entire feasible region and local search. The method hence combines global and local search. We first develop the method for discrete problems and then show that the method can be extended to continuous global optimization. The method is shown to converge with probability one to a global optimum in finite time. In addition, we provide bounds on the expected number of iterations required for convergence, and we suggest two stopping criteria. Numerical examples are also presented to demonstrate the effectiveness of the method.

376 citations


Journal ArticleDOI
TL;DR: A new branching rule is devised that allows columns to be generated efficiently at each node of the branch-and-bound tree and cuts are described that help to strengthen the linear programming relaxation and to mitigate the effects of problem symmetry.
Abstract: We present a column-generation model and branch-and-price-and-cut algorithm for origin-destination integer multicommodity flow problems. The origin-destination integer multicommodity flow problem is a constrained version of the linear multicommodity flow problem in which flow of a commodity (defined in this case by an origin-destination pair) may use only one path from origin to destination. Branch-and-price-and-cut is a variant of branch-and-bound, with bounds provided by solving linear programs using column-and-cut generation at nodes of the branch-and-bound tree. Because our model contains one variable for each origin-destination path, for every commodity, the linear programming relaxations at nodes of the branch-and-bound tree are solved using column generation, i.e., implicit pricing of nonbasic variables to generate new columns or to prove LP optimality. We devise a new branching rule that allows columns to be generated efficiently at each node of the branch-and-bound tree. Then, we describe cuts (cover inequalities) that can be generated at each node of the branch-and-bound tree. These cuts help to strengthen the linear programming relaxation and to mitigate the effects of problem symmetry. We detail the implementation of our combined column-and-cut generation method and present computational results for a set of test problems arising from telecommunications applications. We illustrate the value of our branching rule when used to find a heuristic solution and compare branch-and-price and branch-and-price-and-cut methods to find optimal solutions for highly capacitated problems.

353 citations


Journal ArticleDOI
TL;DR: A framework for asymptotic optimization of a queueing system, motivated by the staffing problem of call centers with hundreds of agents, is developed, and the square-root safety staffing principle, a long-standing rule of thumb for staffing the M/M/N queue, is revisited.
Abstract: We develop a framework for asymptotic optimization of a queueing system. The motivation is the staffing problem of call centers with hundreds of agents (or more). Such a call center is modeled as an M/M/N queue, where the number of agents N is large. Within our framework, we determine the asymptotically optimal staffing level N* that trades off agents' costs with service quality: the higher the latter, the more expensive the former. As an alternative to this optimization, we also develop a constraint satisfaction approach in which one chooses the least N* that adheres to a given constraint on waiting cost. Either way, the analysis gives rise to three regimes of operation: quality-driven, where the focus is on service quality; efficiency-driven, which emphasizes agents' costs; and a rationalized regime that balances, and in fact unifies, the other two. Numerical experiments reveal remarkable accuracy of our asymptotic approximations: over a wide range of parameters, from the very small to the extremely large, N* is exactly optimal, or it is accurate to within a single agent. We demonstrate the utility of our approach by revisiting the square-root safety staffing principle, a long-standing rule of thumb for staffing the M/M/N queue. In its simplest form, our rule is as follows: if c is the hourly cost of an agent, and a is the hourly cost of customers' delay, then N* = R + y*(a/c)·sqrt(R), where R is the offered load, and y*(·) is a function that is easily computable.
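The staffing rule in the last sentence is trivial to apply once the value y*(a/c) is known; a minimal sketch that takes y* as a given input (the function name is ours, and computing y* itself is the paper's contribution):

```python
import math

def staffing_level(R, y_star):
    """Square-root safety staffing: N = R + y* sqrt(R), rounded up.
    R is the offered load (arrival rate times mean service time);
    y_star is the value of the paper's function y*(.) at the cost
    ratio a/c."""
    return math.ceil(R + y_star * math.sqrt(R))

# Offered load of 100 Erlangs with y* = 0.5: staff 5 agents above load.
print(staffing_level(100, 0.5))  # 105
```

The square-root term is the "safety staffing" that buffers stochastic variability; the three operating regimes in the abstract correspond to how this buffer scales with R.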

304 citations


Journal ArticleDOI
TL;DR: This paper proposes to base the Dantzig-Wolfe decomposition of an integer program on the discretization of the integer polyhedron associated with a subsystem of constraints (as opposed to its convexification), formulating the integrality restriction directly on the master variables and setting a theoretical framework for dealing with specific issues such as branching or the introduction of cutting planes in the master.
Abstract: Dantzig-Wolfe decomposition as applied to an integer program is a specific form of problem reformulation that aims at providing a tighter linear programming relaxation bound. The reformulation gives rise to an integer master problem, whose typically large number of variables is dealt with implicitly by using an integer programming column generation procedure, also known as branch-and-price algorithm. There is a large class of integer programs that are well suited for this solution technique. In this paper, we propose to base the Dantzig-Wolfe decomposition of an integer program on the discretization of the integer polyhedron associated with a subsystem of constraints (as opposed to its convexification). This allows us to formulate the integrality restriction directly on the master variables and sets a theoretical framework for dealing with specific issues such as branching or the introduction of cutting planes in the master. We discuss specific branching schemes and their effect on the structure of the column generation subproblem. We give theoretical bounds on the complexity of the separation process and the extent of the modifications to the column generation subproblem. Our computational tests on the cutting stock problem and a generalisation--the cutting strip problem--show that, in practice, all fractional solutions can be eliminated using branching rules that preserve the tractability of the subproblem, but there is a trade-off between branching efficiency and subproblem tractability.

284 citations


Journal ArticleDOI
TL;DR: In this paper, the authors developed a supply network model that takes as input the bill of materials, the (nominal) lead times, the demand and cost data, and the required customer service levels.
Abstract: We develop a supply network model that takes as input the bill of materials, the (nominal) lead times, the demand and cost data, and the required customer service levels. In return, the model generates the base-stock level at each store--the stocking location for a part or an end-product--so as to minimize the overall inventory capital throughout the network and to guarantee the customer service requirements. The key ingredient of the model is a detailed, albeit approximate, analysis of the actual lead times at each store and the associated demand over such lead times, along with a characterization of the operation at each store via an inventory-queue model. The gradients are derived in explicit forms, and a conjugate gradient routine is used to search for the optimal solution. Several numerical examples are presented to validate the model and to illustrate its various features.

282 citations


Journal ArticleDOI
TL;DR: A tabu search is proposed for the Capacitated Arc Routing Problem, which outperforms all known heuristics and often produces a proven optimum.
Abstract: The Capacitated Arc Routing Problem arises in several contexts where streets or roads must be traversed for maintenance purposes or for the delivery of services. A tabu search is proposed for this difficult problem. On benchmark instances, it outperforms all known heuristics and often produces a proven optimum.

280 citations


Journal ArticleDOI
TL;DR: A mathematical programming approach for the classical PSPACE-hard restless bandit problem in stochastic optimization is developed and a priority-index heuristic scheduling policy from the solution to the first-order relaxation is proposed, where the indices are defined in terms of optimal dual variables.
Abstract: We develop a mathematical programming approach for the classical PSPACE-hard restless bandit problem in stochastic optimization. We introduce a hierarchy of N (where N is the number of bandits) increasingly stronger linear programming relaxations, the last of which is exact and corresponds to the (exponential size) formulation of the problem as a Markov decision chain, while the other relaxations provide bounds and are efficiently computed. We also propose a priority-index heuristic scheduling policy from the solution to the first-order relaxation, where the indices are defined in terms of optimal dual variables. In this way we propose a policy and a suboptimality guarantee. We report results of computational experiments that suggest that the proposed heuristic policy is nearly optimal. Moreover, the second-order relaxation is found to provide strong bounds on the optimal value.

265 citations


Journal ArticleDOI
TL;DR: It is found that most traditional and some recent heuristics give poor results when the number of facilities to locate is large and that Variable Neighbourhood search gives consistently best results, on average, in moderate computing time.
Abstract: The multisource Weber problem is to locate simultaneously m facilities in the Euclidean plane to minimize the total transportation cost for satisfying the demand of n fixed users, each supplied from its closest facility. Many heuristics have been proposed for this problem, as well as a few exact algorithms. Heuristics are needed to solve quickly large problems and to provide good initial solutions for exact algorithms. We compare various heuristics, i.e., alternative location-allocation (Cooper 1964), projection (Bongartz et al. 1994), Tabu search (Brimberg and Mladenovic 1996a), p-Median plus Weber (Hansen et al. 1996), Genetic search and several versions of Variable Neighbourhood search. Based on empirical tests that are reported, it is found that most traditional and some recent heuristics give poor results when the number of facilities to locate is large and that Variable Neighbourhood search gives consistently best results, on average, in moderate computing time.
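The oldest of the compared heuristics, Cooper's (1964) alternating location-allocation, is easy to sketch: allocate each user to its nearest facility, then relocate each facility by Weiszfeld iterations on its allocated users, and repeat. A hedged sketch under our own naming and parameter choices, not the paper's implementation:

```python
import math

def cooper_alt(points, centers, iters=30):
    """Cooper's alternating heuristic for the multisource Weber problem:
    alternate nearest-facility allocation with a Weiszfeld location step."""
    centers = list(centers)
    for _ in range(iters):
        # Allocation step: each user goes to its closest facility.
        groups = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)), key=lambda j: math.dist(p, centers[j]))
            groups[j].append(p)
        # Location step: re-solve each single-facility Weber problem.
        for j, grp in enumerate(groups):
            if not grp:
                continue  # a facility with no users keeps its location
            x, y = centers[j]
            for _ in range(20):  # a few Weiszfeld updates suffice here
                nx = ny = den = 0.0
                for px, py in grp:
                    d = math.hypot(px - x, py - y) or 1e-12  # guard zero
                    nx += px / d
                    ny += py / d
                    den += 1.0 / d
                x, y = nx / den, ny / den
            centers[j] = (x, y)
    return centers
```

Like all the local heuristics compared in the paper, this converges only to a local optimum, which is exactly why the authors test metaheuristic wrappers such as Tabu and Variable Neighbourhood search around it.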

254 citations


Journal ArticleDOI
TL;DR: This paper presents a stochastic model for the unit commitment problem that incorporates power trading into the picture and indicates that significant savings can be achieved when the spot market is incorporated into the problem and when a stochastic policy is adopted instead of a deterministic one.
Abstract: The electric power industry is going through deregulation. As a result, the load on the generating units of a utility is becoming increasingly unpredictable. Furthermore, electric utilities may need to buy power or sell their production to a power pool that serves as a spot market for electricity. These trading activities expose utilities to volatile electricity prices. In this paper, we present a stochastic model for the unit commitment problem that incorporates power trading into the picture. Our model also accounts for fuel constraints and prices that may vary with electricity prices and demand. The resulting model is a mixed-integer program that is solved using Lagrangian relaxation and Benders decomposition. Using this solution approach, we solve problems with 729 demand scenarios on a single processor to within 0.1% of the optimal solution in less than 10 minutes. Our numerical results indicate that significant savings can be achieved when the spot market is incorporated into the problem and when a stochastic policy is adopted instead of a deterministic one.

Journal ArticleDOI
TL;DR: Models of travel cost and three types of congestion in crossdocking terminals are used to construct layouts that minimize the labor cost of transferring freight, and an implementation at a less-than-truckload terminal in Stockton, California improved productivity by more than 11%.
Abstract: Handling freight in a crossdocking terminal is labor intensive and therefore costly because workers must unload, sort, and transfer a wide variety of freight from incoming to outgoing trailers. The efficiency of workers depends in large part on how trailers are assigned to doors around the dock; that is, on its layout. A good layout reduces travel distances without creating congestion, but until now no tools have been available to construct such layouts. We describe models of travel cost and three types of congestion typically experienced in crossdocking terminals, and we use them to construct layouts that minimize the labor cost of transferring freight. We report on the use of our models in the less-than-truckload trucking industry, including an implementation at a terminal in Stockton, California that improved productivity by more than 11%.

Journal ArticleDOI
TL;DR: The capacitated network design problem is a multicommodity minimal cost network flow problem with fixed charges on the arcs and is well known to be NP-hard and an efficient method based on a Lagrangian heuristic within a branch-and-bound framework is proposed.
Abstract: The capacitated network design problem is a multicommodity minimal cost network flow problem with fixed charges on the arcs and is well known to be NP-hard. The problem type is very common in the context of transportation networks, telecommunication networks, etc. In this paper we propose an efficient method for this problem, based on a Lagrangian heuristic within a branch-and-bound framework. The Lagrangian heuristic uses a Lagrangian relaxation to obtain easily solved subproblems and solves the Lagrangian dual by subgradient optimization. It also includes techniques for finding primal feasible solutions. The Lagrangian heuristic is then embedded into a branch-and-bound scheme that yields further primal improvements. Special penalty tests and cutting criteria are developed. The branch-and-bound scheme can either be an exact method that guarantees the optimal solution of the problem or be a quicker heuristic. The method has been tested on networks of various structures and sizes. Computational comparisons between this method and a state-of-the-art mixed-integer code are presented. The method is found to be capable of generating good feasible solutions to large-scale problems within reasonable time and data storage limits.
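The Lagrangian-relaxation-with-subgradient scheme described above can be illustrated on a toy instance rather than the paper's network design model: minimize x1 + 2*x2 subject to x1 + x2 >= 1 with binary x. Relaxing the constraint with multiplier lam makes the subproblem separate by variable, and a projected subgradient step with diminishing step sizes updates the multiplier (all names and the instance are ours):

```python
def solve_lagrangian(lam):
    # Relaxed subproblem: min x1 + 2*x2 + lam*(1 - x1 - x2), x in {0,1}^2.
    # It separates: set x_j = 1 iff the reduced cost c_j - lam is negative.
    c = [1.0, 2.0]
    x = [1 if cj - lam < 0 else 0 for cj in c]
    value = sum(cj * xj for cj, xj in zip(c, x)) + lam * (1 - sum(x))
    return value, x

lam, best = 0.0, float("-inf")
for k in range(1, 200):
    value, x = solve_lagrangian(lam)
    best = max(best, value)              # best Lagrangian lower bound so far
    g = 1 - sum(x)                       # subgradient of the dual at lam
    lam = max(0.0, lam + (1.0 / k) * g)  # projected step, diminishing sizes

print(round(best, 2))  # 1.0, matching the optimum of the original problem
```

The paper layers primal heuristics and branch-and-bound on top of exactly this kind of dual bound; here the dual bound happens to close the gap, which is not true in general.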

Journal ArticleDOI
TL;DR: This paper addresses the problem of admission control and sequencing in a production system that produces two classes of products, modeling the joint admission control/sequencing decision in the context of a simple two-class M/M/1 queue to gain insight into the problems.
Abstract: In this paper, we address the problem of admission control and sequencing in a production system that produces two classes of products. The first class of products is made-to-stock, and the firm is contractually obliged to meet demand for this class of products. The second class of products is made-to-order, and the firm has the option to accept (admit) or reject a particular order. The problem is motivated by suppliers in many industries who sign contracts with large manufacturers to supply them with a given product and also can take on additional orders from other sources on a make-to-order basis. We model the joint admission control/sequencing decision in the context of a simple two-class M/M/1 queue to gain insight into the following problems: 1. How should a firm decide (a) when to accept or reject an additional order, and (b) which type of product to produce next? 2. How should a firm decide what annual quantity of orders to commit to when signing a contract to produce the make-to-stock products?

Journal ArticleDOI
TL;DR: It is shown that the basic idea of stability has little to do with optimality of an optimization problem where the parameters are uncertain, and that the parametric analysis will provide alternative solutions that can be tested.
Abstract: Sensitivity analysis, combined with parametric optimization, is often presented as a way of checking if the solution of a deterministic linear program is reliable--even if some of the parameters are not fully known but are instead replaced by a best guess, often a sample mean. It is customary to claim that if the region over which a certain basis is optimal is large, one is fairly safe by using the solution of the linear program. If not, the parametric analysis will provide us with alternative solutions that can be tested. This way, sensitivity analysis is used to facilitate decision making under uncertainty by means of a deterministic tool, namely parametric linear programming. We show in this note that this basic idea of stability has little to do with optimality of an optimization problem where the parameters are uncertain.

Journal ArticleDOI
TL;DR: A feasible solution of the Split Delivery Vehicle Routing Problem (SDVRP) is defined, and it is shown that the convex hull of the associated incidence vectors is a polyhedron (PSDVRP), whose dimension depends on whether a vehicle visiting a client must service, or not, at least one unit of the client demand.
Abstract: In this paper we consider the Split Delivery Vehicle Routing Problem (SDVRP), a relaxation of the known Capacitated Vehicle Routing Problem (CVRP) in which the demand of any client can be serviced by more than one vehicle. We define a feasible solution of this problem, and we show that the convex hull of the associated incidence vectors is a polyhedron (PSDVRP), whose dimension depends on whether a vehicle visiting a client must service, or not, at least one unit of the client demand. From a partial and linear description of PSDVRP and a new family of valid inequalities, we develop a lower bound whose quality is exhibited in the computational results provided, which include the optimal resolution of some known instances--one of them with 50 clients. This instance is, as far as we know, the biggest one solved so far.

Journal ArticleDOI
Fangruo Chen1
TL;DR: This research extends multi-echelon inventory theory in several ways: it generalizes the Clark-Scarf model by allowing batch transfers of inventories, and it identifies optimal policies for multi-stage serial and assembly systems where materials flow in fixed batches.
Abstract: In many production/distribution systems, materials flow from one stage to another in fixed lot sizes. For example, a retailer orders a full truckload from a manufacturer to qualify for a quantity discount; a factory has a material handling system that moves full containers of parts from one production stage to the next. In this paper, we derive optimal policies for multi-stage serial and assembly systems where materials flow in fixed batches. The optimal policies have a simple structure, and their parameters can be easily determined. This research extends the multi-echelon inventory theory in several ways. It generalizes the Clark-Scarf model by allowing batch transfers of inventories. Rosling (1989) shows that assembly systems can be interpreted as serial systems under the assumption that there are no setup costs. We show that the series interpretation still holds when materials flow in fixed batches which satisfy a certain regularity condition. Finally, Veinott (1965) identifies an optimal policy for a single-location inventory system with batch ordering. This paper generalizes his result to multi-echelon settings.
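The single-location batch-ordering result of Veinott (1965) that the paper generalizes is the (R, nQ) rule: when the inventory position falls to the reorder point R or below, order the smallest multiple of the batch size Q that raises it above R. A minimal sketch (names are ours):

```python
def batch_order(inventory_position, reorder_point, Q):
    """(R, nQ) rule: order the smallest multiple of Q that lifts the
    inventory position strictly above the reorder point; otherwise 0."""
    if inventory_position > reorder_point:
        return 0
    deficit = reorder_point - inventory_position + 1
    n = -(-deficit // Q)  # ceiling division
    return n * Q

# Position 3, reorder point 10, batch size 5: two batches are needed.
print(batch_order(3, 10, 5))  # 10
```

The paper's contribution is showing that policies of this batch-structured form remain optimal echelon by echelon in serial and assembly systems, under a regularity condition on the batch sizes.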

Journal ArticleDOI
TL;DR: A large-scale simulation model is constructed using data from over 30,000 transplants and the simulation results demonstrate that the dynamic index policy increases the quality-adjusted life expectancy and reduces the mean waiting time until transplantation for all six demographic groups under consideration.
Abstract: The crux of the kidney allocation problem is the trade-off between clinical efficiency and equity. We consider a dynamic resource allocation problem with the tri-criteria objective of maximizing the quality-adjusted life expectancy of transplant candidates (clinical efficiency) and minimizing two measures of inequity: a linear function of the likelihood of transplantation of the various types of patients, and a quadratic function that quantifies the differences in mean waiting times across patient types. The dynamic status of patients is modeled by a set of linear differential equations, and an approximate analysis of the optimal control problem yields a dynamic index policy. We construct a large-scale simulation model using data from over 30,000 transplants, and the simulation results demonstrate that, relative to the organ allocation policy currently employed in the United States, the dynamic index policy increases the quality-adjusted life expectancy and reduces the mean waiting time until transplantation for all six demographic groups (two sexes, races, and age groups) under consideration.

Journal ArticleDOI
TL;DR: An efficient implementation of the push-relabel algorithm adapted to the features of the open-pit mining problem is proposed, offering significant improvements over existing algorithms and making the related sensitivity analysis problem practically solvable.
Abstract: The open-pit mining problem is to determine the contours of a mine, based on economic data and engineering feasibility requirements, to yield maximum possible net income. This practical problem needs to be solved for very large data sets. In practice, moreover, it is necessary to test multiple scenarios, taking into account a variety of realizations of geological predictions and forecasts of ore value. The industry is experiencing computational difficulties in solving the problem. Yet, the problem is known to be equivalent to the minimum cut or maximum flow problem. For the maximum flow problem there are a number of very efficient algorithms that have been developed over the last decade. On the other hand, the algorithm that is most commonly used by the mining industry has been devised by Lerchs and Grossmann (LG). This algorithm is used in most commercial software packages for open-pit mining. This paper describes a detailed study of the LG algorithm as compared to the maximum flow "push-relabel" algorithm. We propose here an efficient implementation of the push-relabel algorithm adapted to the features of the open-pit mining problem. We study issues such as the properties of the mine and ore distribution and how they affect the running time of the algorithm. We also study some features that improve the performance of the LG algorithm. The proposed implementations offer significant improvements compared to existing algorithms and make the related sensitivity analysis problem practically solvable.

Journal ArticleDOI
TL;DR: This article presents a model that reflects this yield management problem; results include an exact solution for the continuous-time model, piecewise concavity of the value function with respect to time and inventory, and monotonicity of the optimal policy.
Abstract: It is a common practice for industries to price the same products at different levels. For example, airlines charge various fares for a common pool of seats. Seasonal products are sold at full or discount prices during different phases of the season. This article presents a model that reflects this yield management problem. The model assumes that (1) products are offered at multiple predetermined prices over time; (2) demand is price sensitive and follows a Poisson process; and (3) price is allowed to change monotonically, i.e., either the markup or markdown policy is implemented. To maximize the expected revenue, management needs to determine the optimal times to switch between prices based on the remaining season and inventory. Major results in this research include (1) an exact solution for the continuous-time model; (2) piecewise concavity of the value function with respect to time and inventory; and (3) monotonicity of the optimal policy. The implementation of optimal policies is straightforward because of the threshold points embedded in the value function. The value function and time thresholds can be solved with a reasonable computation effort. Numerical examples are provided.

Journal ArticleDOI
TL;DR: This research addresses the trade-off between product development time and costs and introduces an algorithm to determine an appropriate overlapping strategy under different scenarios.
Abstract: Increasingly shorter product life cycles impel firms to design, develop, and market more products in less time than ever before. Overlapping of design and development stages is commonly regarded as the most promising strategy to reduce product development times. However, overlapping typically requires additional resources and can be costly. Our research addresses the trade-off between product development time and costs and introduces an algorithm to determine an appropriate overlapping strategy under different scenarios. The methodology developed was successfully employed at Rocketdyne Division of Rockwell International.

Journal ArticleDOI
TL;DR: A taxonomy of environments in which IIT scheduling is relevant is provided, the extant literature on IIT scheduling is reviewed, and areas of opportunity for future research are identified.
Abstract: In the context of production scheduling, inserted idle time (IIT) occurs whenever a resource is deliberately kept idle in the face of waiting jobs. IIT schedules are particularly relevant in multimachine industrial situations where earliness costs and/or dynamically arriving jobs with due dates come into play. We provide a taxonomy of environments in which IIT scheduling is relevant, review the extant literature on IIT scheduling, and identify areas of opportunity for future research.

Journal ArticleDOI
TL;DR: The model is tested on blocking problems from a major railroad, and the results show that the blocking plans generated have the potential to reduce the railroad's operating costs by millions of dollars annually.
Abstract: In this study, we formulate the railroad blocking problems as a network design problem with maximum degree and flow constraints on the nodes and propose a heuristic Lagrangian relaxation approach to solve the problem. The new approach decomposes the complicated mixed integer programming problem into two simple subproblems so that the storage requirement and computational effort are greatly reduced. A set of inequalities is added to one subproblem to tighten the lower bounds and facilitate generating feasible solutions. Subgradient optimization is used to solve the Lagrangian dual. An advanced dual feasible solution is generated to speed up the convergence of the subgradient method. The model is tested on blocking problems from a major railroad, and the results show that the blocking plans generated have the potential to reduce the railroad's operating costs by millions of dollars annually.

Journal ArticleDOI
TL;DR: This paper models the component design problem as a mathematical program that considers production, inventory holding, setup, and complexity costs (the cost in indirect functions caused by component variety), and develops two approaches to solve the problem: a branch-and-bound algorithm that can solve small- and medium-size problems optimally, and a simulated annealing algorithm that can solve large-size problems heuristically.
Abstract: Increased competition and more demanding customers have forced companies to offer a wide variety of products. Component commonality can help companies reduce the cost of providing product variety to their customers. However, determining the extent to which component commonality should be used is difficult. In this paper we present an approach to determine the optimal level of component commonality for end-product components that do not differentiate models from the customer's perspective. The work was inspired by and applied to a wire-harness design problem faced by a major automobile manufacturer. We model the component design problem as a mathematical program that considers production, inventory holding, setup, and complexity costs (the cost in indirect functions caused by component variety). We develop two approaches to solve the problem: a branch-and-bound algorithm that can solve small- and medium-size problems optimally, and a simulated annealing algorithm that can solve large-size problems heuristically. We apply both algorithms to the wire-harness design problem faced by the automobile manufacturer and to a number of randomly generated test problems. We show that an optimal design achieves high cost savings by using significantly fewer variants than a no-commonality design but significantly more variants than a full-commonality design. We apply sensitivity analysis to identify extreme conditions under which the no-commonality and full-commonality designs perform well, and we identify the key cost drivers for our application. Finally, we describe the impact of our analysis on the company's subsequent component design decisions.

Journal ArticleDOI
TL;DR: An adaptive pricing and ordering policy is developed with the asymptotic property that the average realized profit per period converges with probability one to the optimal value under complete information on the distribution.
Abstract: We consider the combined problem of pricing and ordering for a perishable product with unknown demand distribution and censored demand observations resulting from lost sales, faced by a monopolistic retailer. We develop an adaptive pricing and ordering policy with the asymptotic property that the average realized profit per period converges with probability one to the optimal value under complete information on the distribution. The pricing mechanism is modeled as a multiarmed bandit problem, while the order quantity decision, made after the price level is established, is based on a stochastic approximation procedure with multiplicative updates.
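The order-quantity half of the policy can be illustrated with a minimal multiplicative stochastic-approximation update driven only by the censored stockout signal. The step-size rule and the exact update below are assumptions for illustration, not the paper's procedure:

```python
import math

def adapt_quantity(demands, fractile, q0=1.0, c=1.0):
    """Multiplicative stochastic-approximation sketch for the order quantity.

    Only the censored signal 'did we stock out?' (demand >= quantity) is
    observed each period. The quantity is nudged up after a stockout and
    down otherwise, targeting P(stockout) = 1 - fractile. A minimal sketch;
    the paper's updates and step sizes differ.
    """
    q = q0
    for n, d in enumerate(demands, start=1):
        stockout = 1.0 if d >= q else 0.0
        step = c / math.sqrt(n)               # diminishing step size
        q *= math.exp(step * (stockout - (1.0 - fractile)))
    return q
```

With a diminishing step, the quantity settles near the demand level whose stockout frequency matches the target fractile.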

Journal ArticleDOI
TL;DR: This work presents a method for exact evaluation of control policies that provides the complete probability distributions of the retailer inventory levels in a two-level inventory system with one central warehouse and N retailers.
Abstract: We consider a two-level inventory system with one central warehouse and N retailers. All installations apply different continuous review installation stock (R, Q) policies. The retailers face independent compound Poisson demand processes. Transportation times are constant. We present a method for exact evaluation of control policies that provides the complete probability distributions of the retailer inventory levels.
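For a single location, the kind of "complete probability distribution" the paper computes can be sketched with the standard facts that the inventory position under (R, Q) control is uniform on {R+1, ..., R+Q} and that inventory level = inventory position minus lead-time demand. This one-location, pure-Poisson version is an illustration only; the paper treats compound Poisson demand and links the warehouse and retailer distributions exactly:

```python
from math import exp

def il_distribution(R, Q, mean_ltd, tail=80):
    """P(IL = j) at a single stage under continuous-review (R, Q) control
    with Poisson(mean_ltd) lead-time demand.
    """
    # Poisson pmf computed iteratively to avoid large factorials
    pmf = [exp(-mean_ltd)]
    for k in range(1, R + Q + tail):
        pmf.append(pmf[-1] * mean_ltd / k)

    dist = {}
    for j in range(R + Q - len(pmf) + 1, R + Q + 1):
        # condition on the (uniform) inventory position y, subtract demand y - j
        dist[j] = sum(pmf[y - j] for y in range(R + 1, R + Q + 1)
                      if 0 <= y - j < len(pmf)) / Q
    return dist
```

A quick sanity check: the probabilities sum to one, and the mean inventory level equals E[IP] - E[D] = R + (Q + 1)/2 - mean lead-time demand.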

Journal ArticleDOI
TL;DR: An integer programming formulation for the problem that involves an exponential number of binary variables and associated columns, each of which corresponds to selecting a fixed number of copies of a specific cutting pattern is proposed.
Abstract: The cutting stock problem is that of finding a cutting of stock material to meet demands for small pieces of prescribed dimensions while minimising the amount of waste. Because changing over from one cutting pattern to another involves significant setups, an auxiliary problem is to minimise the number of different patterns that are used. The pattern minimisation problem is significantly more complex, but it is of great practical importance. In this paper, we propose an integer programming formulation for the problem that involves an exponential number of binary variables and associated columns, each of which corresponds to selecting a fixed number of copies of a specific cutting pattern. The integer program is solved using a column generation approach where the subproblem is a nonlinear integer program that can be decomposed into multiple bounded integer knapsack problems. At each node of the branch-and-bound tree, the linear programming relaxation of our formulation is made tighter by adding super-additive inequalities. Branching rules are presented that yield a balanced tree. Incumbent solutions are obtained using a rounding heuristic. The resulting branch-and-price-and-cut procedure is used to produce optimal or approximately optimal solutions for a set of real-life problems.
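The building block of the decomposed pricing subproblem, a bounded integer knapsack, can be sketched with a plain dynamic program. In the paper's column generation the per-piece "values" would come from the dual prices of the master LP; here they are just given numbers:

```python
def bounded_knapsack(capacity, items):
    """items: list of (weight, value, max_copies) triples.
    Returns the maximum total value packable into `capacity`.

    Naive copy expansion of the classic 0/1 DP -- fine for small bounds;
    the paper's subproblem is a nonlinear integer program built from
    knapsacks like this one.
    """
    best = [0] * (capacity + 1)
    for w, v, bound in items:
        for _ in range(bound):                  # expand each bounded item into copies
            for c in range(capacity, w - 1, -1):  # descending: each copy used once
                best[c] = max(best[c], best[c - w] + v)
    return best[capacity]
```

For example, with capacity 10 and items (weight 3, value 4, up to 2 copies), (4, 5, up to 3), and (2, 3, up to 1), the optimum is 13 (two of the first item plus one of the second).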

Journal ArticleDOI
TL;DR: A reserve selection formulation is developed that incorporates probabilistic presence-absence data: a discrete 0/1 optimization model that maximizes the number of represented vegetation communities subject to a budget constraint.
Abstract: Interest in protecting natural areas is increasing as development pressures and conflicting land uses threaten and fragment ecosystems. A variety of quantitative approaches have been developed to help managers select sites for biodiversity protection. The problem is often formulated to select the set of reserve sites that maximizes the number of species or ecological communities that are represented, subject to an upper bound on the number or area of selected sites. Most formulations assume that information about the presence or absence of species in the candidate sites is known with certainty. Because complete information typically is lacking, we developed a reserve selection formulation that incorporates probabilistic presence-absence data. The formulation was a discrete 0/1 optimization model that maximized the number of represented vegetation communities subject to a budget constraint, where a community was considered represented if its probability of occurrence in the set of selected sites exceeded a specified minimum reliability threshold. Although the formulation was nonlinear, a log transformation allowed us to represent the problem in a linear format that could be solved using exact optimization methods. The formulation was tested using a moderately sized reserve selection problem based on data from the Superior National Forest in Minnesota.
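The log transformation that linearizes the reliability constraint can be made concrete. Assuming independent occurrence across sites (as in the model), a community is represented when 1 − ∏ⱼ(1 − pⱼ) ≥ α over the selected sites, which is equivalent to the linear-in-x condition Σⱼ xⱼ·log(1 − pⱼ) ≤ log(1 − α):

```python
from math import log, prod

def represented(probs, selected, alpha):
    """True if the probability that the community occurs in at least one
    selected site reaches the reliability threshold alpha.

    probs[j] -- occurrence probability in candidate site j. Sketch of the
    chance constraint only; choosing the sites is the 0/1 optimization
    the paper solves.
    """
    miss = prod(1.0 - probs[j] for j in selected)   # P(absent from every pick)
    return 1.0 - miss >= alpha

def represented_linear(probs, selected, alpha):
    """Log-transformed, linear form of the same constraint:
    sum over selected j of log(1 - p_j) <= log(1 - alpha)."""
    return sum(log(1.0 - probs[j]) for j in selected) <= log(1.0 - alpha)
```

Both forms agree away from the threshold boundary; the linear one is what allows exact integer programming methods to be applied.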

Journal ArticleDOI
TL;DR: Use of Clark and Scarf's (1960) idea of echelon stocks reduces a complex, multidimensional stocking problem to the analysis of a series of one-dimensional subproblems, and top-down base stock policies, which are readily amenable to managerial interpretation, are shown to be optimal.
Abstract: After reformulating Clark and Scarf's (1960) classical serial multi-echelon model so that the lead time between adjacent echelons is one week (period), the option to expedite between each resulting echelon is added. Thus, each week requires a decision to be made at each echelon on how many units to expedite in from the next upstream echelon (to be received immediately) and how many to regular order (to be received in one week), with the remainder detained (left as is). The model can be interpreted as addressing dynamic lead time management, in which the (remaining) effective lead time for each ordered unit can be dynamically reduced by expediting and/or extended. Use of Clark and Scarf's (1960) idea of echelon stocks reduces a complex, multidimensional stocking problem to the analysis of a series of one-dimensional subproblems. What are called top-down base stock policies, which are readily amenable to managerial interpretation, are shown to be optimal. Myopic policies are shown to be optimal in the stationary, infinite horizon case. The results are illustrated numerically.
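The echelon-stock bookkeeping from Clark and Scarf that drives the decomposition can be sketched for a plain serial base-stock policy. This shows only the echelon accounting; the paper's optimal top-down policy additionally decides how much of each order to expedite or detain:

```python
def echelon_orders(on_hand, in_transit, S):
    """Echelon base-stock ordering for a serial system, stage 0 = most downstream.

    Echelon inventory position of stage i = all stock at or below stage i,
    plus stock in transit to those stages. Each stage orders up to its
    echelon base-stock level S[i]. Plain base stock only; illustrative of
    the echelon-stock idea, not the paper's expediting policy.
    """
    orders = []
    for i in range(len(on_hand)):
        echelon_ip = sum(on_hand[: i + 1]) + sum(in_transit[: i + 1])
        orders.append(max(0, S[i] - echelon_ip))
    return orders
```

Because each stage's state collapses to a single echelon inventory position, each ordering decision becomes a one-dimensional problem, which is precisely the reduction the abstract describes.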

Journal ArticleDOI
TL;DR: The optimal return function and the optimal acceptance strategy for this problem are characterized and solution methods are given, and the classic airline yield problem is formulated and solved.
Abstract: The finite horizon stochastic knapsack combines a secretary problem with an integer knapsack problem. It is useful for optimizing sales of perishable commodities with low marginal costs to impatient customers. Applications include yield management for airlines, hotels/motels, broadcasting advertisements, and car rentals. In these problems, K types of customers arrive stochastically. Customer type k has an integer weight w_k, a value b_k, and an arrival rate λ_k(t) (which depends on time). We consider arrivals over a continuous time horizon [0, T] to a "knapsack" with capacity W. For each arrival that fits in the remaining knapsack capacity, we may (1) accept it, receiving b_k, while giving up capacity w_k; or (2) reject it, forgoing the value and not losing capacity. The choice must be immediate; a customer not accepted on arrival is lost. We model the problem using continuous time, discrete state, finite horizon, dynamic programming. We characterize the optimal return function and the optimal acceptance strategy for this problem, and we give solution methods. We generalize to multidimensional knapsack problems. We also consider the special case where w_k = 1 for all k. This is the classic airline yield problem. Finally, we formulate and solve a new version of the secretary problem.
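For the w_k = 1 special case, the continuous-time dynamic program can be sketched as a time-discretized backward recursion: accept a type-k customer exactly when its fare b_k covers the marginal value of a unit of capacity. Constant arrival rates and the step size are simplifying assumptions here (the paper allows time-varying rates and works in continuous time):

```python
def yield_dp(T, W, arrivals, dt=0.01):
    """Backward recursion for the w_k = 1 (single-seat) stochastic knapsack.

    arrivals: list of (rate, fare) pairs, rates assumed constant over [0, T].
    Returns the value function V[capacity] at time 0. Discretized sketch of
    the continuous-time DP; requires dt small enough that sum(rate) * dt < 1.
    """
    V = [0.0] * (W + 1)                      # V(T, c) = 0: unsold units are worthless
    for _ in range(int(round(T / dt))):
        nxt = V[:]
        for c in range(1, W + 1):
            marginal = V[c] - V[c - 1]       # opportunity cost of one unit of capacity
            gain = sum(rate * dt * max(fare - marginal, 0.0)
                       for rate, fare in arrivals)
            nxt[c] = V[c] + gain             # accept a type iff its fare >= marginal
        V = nxt
    return V
```

The recursion makes the structure of the optimal acceptance strategy visible: the value function is nondecreasing in capacity, and the acceptance rule is a fare threshold equal to the current marginal value of capacity.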