Showing papers in "Journal of the Operational Research Society in 2005"


Journal ArticleDOI
TL;DR: Comparison results are based on a theoretical analysis of the mean square error due to its mathematically tractable nature and the categorization rules proposed are expressed in terms of the average inter-demand interval and the squared coefficient of variation of demand sizes.
Abstract: The categorization of alternative demand patterns facilitates the selection of a forecasting method and it is an essential element of many inventory control software packages. The common practice in the inventory control software industry is to arbitrarily categorize those demand patterns and then proceed to select an estimation procedure and optimize the forecast parameters. Alternatively, forecasting methods can be directly compared, based on some theoretically quantified error measure, for the purpose of establishing regions of superior performance and then defining the demand patterns based on the results. It is this approach that is discussed in this paper, and its application is demonstrated by considering EWMA, Croston's method and an alternative to Croston's estimator developed by the first two authors of this paper. Comparison results are based on a theoretical analysis of the mean square error due to its mathematically tractable nature. The categorization rules proposed are expressed in terms of the average inter-demand interval and the squared coefficient of variation of demand sizes. The validity of the results is tested on 3000 real intermittent demand data series coming from the automotive industry.

329 citations
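
To make the quantities in this abstract concrete, the sketch below computes a Croston-style forecast and the two categorization statistics (average inter-demand interval and squared coefficient of variation of demand sizes) for a toy series. The smoothing constant, the (1 - alpha/2) bias correction and the cut-off values 1.32 and 0.49 are illustrative assumptions, not necessarily the values derived in the paper.

```python
import numpy as np

def croston(demand, alpha=0.1, sba=False):
    """Croston-style one-step-ahead demand-rate forecast for an intermittent series.
    If sba=True, applies a (1 - alpha/2) bias correction (assumed form of the
    correction developed by the first two authors)."""
    z = None   # smoothed demand size
    p = None   # smoothed inter-demand interval
    q = 1      # periods since the last non-zero demand
    for d in demand:
        if d > 0:
            if z is None:                 # initialise on the first non-zero demand
                z, p = d, q
            else:
                z = z + alpha * (d - z)
                p = p + alpha * (q - p)
            q = 1
        else:
            q += 1
    if z is None:
        return 0.0
    f = z / p
    return f * (1 - alpha / 2) if sba else f

def categorize(demand):
    """Average inter-demand interval and CV^2 of demand sizes, with illustrative cut-offs."""
    arr = np.asarray(demand, dtype=float)
    nz = np.flatnonzero(arr > 0)
    sizes = arr[nz]
    adi = np.mean(np.diff(nz)) if len(nz) > 1 else float("inf")
    cv2 = (sizes.std() / sizes.mean()) ** 2
    # Cut-off values below are illustrative; the paper derives its own from the MSE analysis.
    if adi <= 1.32 and cv2 <= 0.49:
        label = "smooth"
    elif adi <= 1.32:
        label = "erratic"
    elif cv2 <= 0.49:
        label = "intermittent"
    else:
        label = "lumpy"
    return adi, cv2, label

series = [0, 0, 3, 0, 0, 0, 5, 0, 2, 0, 0, 4]
print(round(croston(series, alpha=0.15, sba=True), 3))
print(categorize(series))
```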


Journal ArticleDOI
TL;DR: A DEA model is developed to estimate the relative efficiency of the countries in converting income to knowledge and life opportunities and the transformation paradigm is introduced in the assessment of human development.
Abstract: To consider different aspects of life when measuring human development, the United Nations Development Program introduced the Human Development Index (HDI). The HDI is a composite index of socioeconomic indicators that reflect three major dimensions of human development: longevity, knowledge and standard of living. In this paper, the assessment of the HDI is reconsidered in the light of data envelopment analysis (DEA). Instead of a simple rank of the countries, human development is benchmarked on the basis of empirical observations of best-practice countries. First, along the same lines as the HDI, we develop a DEA-like model to assess the relative performance of the countries in human development. Then we extend our calculations with a post-DEA model to derive global estimates of a new development index by using common weights for the socioeconomic indicators. Finally, we introduce the transformation paradigm in the assessment of human development. We develop a DEA model to estimate the relative efficiency of the countries in converting income to knowledge and life opportunities.

317 citations
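
A minimal sketch of the DEA-like idea described above: each country chooses the indicator weights most favourable to itself, subject to no country's weighted score exceeding 1. The data, the lower bound on the weights and the use of scipy's linprog are assumptions for illustration; the paper's models (including the post-DEA common-weights step) are richer.

```python
import numpy as np
from scipy.optimize import linprog

# Rows: countries (DMUs); columns: normalized socio-economic indicators
# (e.g. longevity, education, income). Purely illustrative numbers.
Y = np.array([
    [0.90, 0.85, 0.80],
    [0.75, 0.95, 0.70],
    [0.60, 0.55, 0.95],
    [0.50, 0.45, 0.40],
])

def dea_like_score(Y, j0, eps=1e-4):
    """Best-practice score of DMU j0: choose indicator weights w >= eps that
    maximize its own weighted sum while no DMU's weighted sum exceeds 1."""
    n, m = Y.shape
    c = -Y[j0]                        # linprog minimizes, so negate the objective
    res = linprog(c, A_ub=Y, b_ub=np.ones(n),
                  bounds=[(eps, None)] * m, method="highs")
    return -res.fun

scores = [dea_like_score(Y, j) for j in range(len(Y))]
print(np.round(scores, 3))            # best-practice countries score (close to) 1
```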


Journal ArticleDOI
TL;DR: An example of forest management illustrates that the compromise solution approach is able to generate a common set of weights, which not only differentiates efficient DMUs but also detects abnormal efficiency scores on a common base.
Abstract: A characteristic of data envelopment analysis (DEA) is to allow individual decision-making units (DMUs) to select the factor weights that are the most advantageous for them in calculating their efficiency scores. This flexibility in selecting the weights, on the other hand, deters the comparison among DMUs on a common base. In order to rank all the DMUs on the same scale, this paper proposes a compromise solution approach for generating common weights under the DEA framework. The efficiency scores calculated from the standard DEA model are regarded as the ideal solution for the DMUs to achieve. A common set of weights which produces the vector of efficiency scores for the DMUs closest to the ideal solution is sought. Based on the generalized measure of distance, a family of efficiency scores called ‘compromise solutions’ can be derived. The compromise solutions have the properties of unique solution and Pareto optimality not enjoyed by the solutions derived from the existing methods of common weights. An example of forest management illustrates that the compromise solution approach is able to generate a common set of weights, which not only differentiates efficient DMUs but also detects abnormal efficiency scores on a common base.

241 citations


Journal ArticleDOI
TL;DR: The progress of simulation from its early days is charted with a particular focus on recent history, and the desirability of continuing to follow developments in computing, without significant developments in the wider methodology of simulation, is questioned.
Abstract: Discrete-event simulation is one of the most popular modelling techniques. It has developed significantly since the inception of computer simulation in the 1950s, most of this in line with developments in computing. The progress of simulation from its early days is charted with a particular focus on recent history. Specific developments in the past 15 years include visual interactive modelling, simulation optimization, virtual reality, integration with other software, simulation in the service sector, distributed simulation and the use of the worldwide web. The future is then speculated upon. Potential changes in model development, model use, the domain of application for simulation and integration with other simulation approaches are all discussed. The desirability of continuing to follow developments in computing, without significant developments in the wider methodology of simulation, is questioned.

223 citations


Journal ArticleDOI
TL;DR: The double-hurdle model, originally due to Cragg, and conventionally applied to household consumption or labour supply decisions, contains two equations, one which determines whether or not a customer is a potential defaulter (the ‘first hurdle’), and the other which determines the extent of default.
Abstract: Some models of loan default are binary, simply modelling the probability of default, while others go further and model the extent of default (eg number of outstanding payments; amount of arrears). The double-hurdle model, originally due to Cragg (Econometrica, 1971), and conventionally applied to household consumption or labour supply decisions, contains two equations, one which determines whether or not a customer is a potential defaulter (the ‘first hurdle’), and the other which determines the extent of default. In separating these two processes, the model recognizes that there exists a subset of the observed non-defaulters who would never default whatever their circumstances. A Box-Cox transformation applied to the dependent variable is a useful generalization to the model. Estimation is relatively easy using the Maximum Likelihood routine available in STATA. The model is applied to a sample of 2515 loan applicants for whom loans were approved, a sizeable proportion of whom defaulted in varying degrees. The dependent variables used are amount in arrears and number of days in arrears. The value of the hurdle approach is confirmed by finding that certain key explanatory variables have very different effects between the two equations. Most notably, the effect of loan amount is strongly positive on arrears, while being U-shaped on the probability of default. The former effect is seriously under-estimated when the first hurdle is ignored.

163 citations
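
A minimal sketch of Cragg-style double-hurdle estimation under the usual independence assumption: a probit first hurdle for whether a customer is a potential defaulter, and a normal equation for the extent of default, fitted jointly by maximum likelihood. The simulated data, variable names and the omission of the Box-Cox transformation are assumptions made for illustration.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated data: z drives the defaulter/non-defaulter hurdle, x drives the
# extent of default (e.g. amount in arrears). Purely illustrative.
n = 2000
z = np.column_stack([np.ones(n), rng.normal(size=n)])
x = np.column_stack([np.ones(n), rng.normal(size=n)])
gamma_true, beta_true, sigma_true = np.array([0.3, 1.0]), np.array([1.0, 2.0]), 1.5
d = (z @ gamma_true + rng.normal(size=n)) > 0             # first hurdle passed?
ystar = x @ beta_true + sigma_true * rng.normal(size=n)   # latent extent of default
y = np.where(d & (ystar > 0), ystar, 0.0)                 # observed arrears

def negloglik(theta):
    g, b, s = theta[:2], theta[2:4], np.exp(theta[4])
    p1 = norm.cdf(z @ g)                   # probability the first hurdle is passed
    xb = x @ b
    ll_zero = np.log(1 - p1 * norm.cdf(xb / s) + 1e-12)   # zeros from either hurdle
    ll_pos = np.log(p1 + 1e-12) + norm.logpdf(y, loc=xb, scale=s)
    return -np.sum(np.where(y > 0, ll_pos, ll_zero))

fit = minimize(negloglik, np.zeros(5), method="BFGS")
print(np.round(fit.x, 2))                  # estimates of gamma, beta, log(sigma)
```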


Journal ArticleDOI
TL;DR: The open vehicle routing problem (OVRP) is studied, in which the vehicles are not required to return to the depot, but if they do, it must be by revisiting the customers assigned to them in the reverse order.
Abstract: In this paper, a variant of the vehicle routing problem (VRP), the open vehicle routing problem (OVRP), is studied, in which the vehicles are not required to return to the depot, but if they do, it must be by revisiting the customers assigned to them in the reverse order. By exploiting the special structure of this type of problem, we present a new tabu search heuristic for finding routes that minimize two objectives while satisfying three constraints. Computational results are provided and compared with two other methods from the literature.

161 citations


Journal ArticleDOI
TL;DR: This research illustrates how, through their involvement in this development process, management came to understand that seemingly contradictory goals such as customer satisfaction, employee satisfaction and employee productivity were better seen as mutually reinforcing.
Abstract: The balanced scorecard (BSC) has become a popular concept for performance measurement. It focuses attention of management on only a few performance measures and bridges different functional areas as it includes both financial and non-financial measures. However, doubts frequently arise regarding the quality of the BSCs developed as well as the quality of the process in which this development takes place. This article describes a case study in which system dynamics (SD) modelling and simulation was used to overcome both kinds of problems. In a two-stage modelling process (qualitative causal loop diagramming followed by quantitative simulation), a BSC was developed for management of one organizational unit of a leading Dutch insurer. This research illustrates how, through their involvement in this development process, management came to understand that seemingly contradictory goals such as customer satisfaction, employee satisfaction and employee productivity were, in fact, better seen as mutually reinforcing. Also, analysis of the SD model showed how, contrary to ex ante management intuition, performance would first have to drop further before significant improvements could be realized. Finally, the quantitative modelling process also helped to evaluate several improvement initiatives that were under consideration at the time, proving some of them to have unclear benefits, others to be very promising indeed.

159 citations
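
The 'worse-before-better' pattern mentioned above is easy to reproduce in a toy system dynamics model: diverting capacity into an improvement initiative depresses measured performance before the capability gains dominate. The stock, flows and parameters below are purely illustrative and bear no relation to the insurer's actual model.

```python
import numpy as np

# Toy "worse-before-better" dynamic: from t = 10 a fixed share of capacity is
# diverted into an improvement initiative, so measured productivity drops at
# first, then recovers and overtakes its old level as capability builds up.
dt, horizon = 0.25, 60.0
capability = 1.0            # stock: process capability
effort_share = 0.2          # share of capacity diverted to improvement work
productivity = []
for t in np.arange(0.0, horizon, dt):
    effort = effort_share if t >= 10 else 0.0
    # capability grows with improvement effort and slowly erodes back towards 1
    capability += dt * (0.05 * effort * capability - 0.01 * (capability - 1.0))
    productivity.append(capability * (1.0 - effort))   # output net of diverted time
print("before:", 1.0, "trough:", round(min(productivity), 3),
      "end:", round(productivity[-1], 3))
```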


Journal ArticleDOI
TL;DR: The development of the methodology is surveyed, the current environment for consumer lending is described, and some of the modelling areas and issues that are actively being researched or should be are identified.
Abstract: Methods for assessing credit risk when lending to consumers have been in operation for 50 years. Yet, there are probably now more opportunities and challenges for research into the development of this area than ever before. This paper surveys the development of the methodology, describes the current environment for consumer lending and seeks to identify some of the modelling areas and issues that are actively being researched or should be.

155 citations


Journal ArticleDOI
TL;DR: A heuristic algorithm with a worst-case bound of m for each criterion is given, and a polynomial algorithm is proposed for both of the special cases: identical processing times on each machine and an increasing series of dominating machines.
Abstract: The paper is devoted to some flow-shop scheduling problems with a learning effect. The objective is to minimize one of two regular performance criteria, namely makespan and total flowtime. A heuristic algorithm with a worst-case bound of m for each criterion is given, where m is the number of machines. Furthermore, a polynomial algorithm is proposed for both of the special cases: identical processing times on each machine and an increasing series of dominating machines. An example is also constructed to show that the classical Johnson's rule does not give an optimal solution for two-machine flow-shop scheduling to minimize makespan with a learning effect. Some extensions of the problem are also shown.

153 citations
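
A small sketch of the setting: with a position-based learning effect p·r^a (an assumed functional form), the makespan of a two-machine flow shop can be evaluated for any sequence and compared with Johnson's rule. For the illustrative data below, the enumerated optimum turns out to differ from Johnson's sequence, echoing the paper's counterexample.

```python
from itertools import permutations

# Two-machine flow shop with a position-based learning effect: a job placed in
# position r takes p * r**a time units, with a < 0 (assumed form of the effect).
a = -0.3
p = [(4, 3), (2, 5), (6, 2), (3, 6)]   # (machine-1 time, machine-2 time) per job

def makespan(seq):
    c1 = c2 = 0.0
    for r, j in enumerate(seq, start=1):
        t1, t2 = p[j][0] * r ** a, p[j][1] * r ** a
        c1 += t1                        # machine 1 processes jobs back to back
        c2 = max(c1, c2) + t2           # machine 2 waits for machine 1 if needed
    return c2

def johnson_order(p):
    first = sorted((j for j, (x, y) in enumerate(p) if x <= y), key=lambda j: p[j][0])
    second = sorted((j for j, (x, y) in enumerate(p) if x > y), key=lambda j: -p[j][1])
    return first + second

best = min(permutations(range(len(p))), key=makespan)
jr = johnson_order(p)
print("Johnson sequence:", jr, round(makespan(jr), 3))
print("Optimal sequence:", list(best), round(makespan(best), 3))
```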



Journal ArticleDOI
TL;DR: A robust optimization model is presented for the strategic capacity planning and warehouse location problem in supply chains operating under uncertainty and how robust solutions may be determined with an efficient decomposition algorithm using a special Lagrangian relaxation method.
Abstract: We discuss the strategic capacity planning and warehouse location problem in supply chains operating under uncertainty. In particular, we consider situations in which demand variability is the only source of uncertainty. We first propose a deterministic model for the problem when all relevant parameters are known with certainty, and discuss related tractability and computational issues. We then present a robust optimization model for the problem when the demand is uncertain, and demonstrate how robust solutions may be determined with an efficient decomposition algorithm using a special Lagrangian relaxation method in which the multipliers are constructed from dual variables of a linear program.

Journal ArticleDOI
TL;DR: Analytical results show that a stationary solution to the Kuhn–Tucker necessary conditions can be found and it is shown to be the optimal solution.
Abstract: This paper investigates the problem of jointly determining the order size and optimal prices for a perishable inventory system under the condition that demand is time and price dependent. It is assumed that a decision-maker has the opportunity to adjust prices before the end of the sales season to influence demand and to improve revenues. A mathematical model is developed to find the optimal number of prices, the optimal prices and the order quantity. Analytical results show that a stationary solution to the Kuhn–Tucker necessary conditions can be found and it is shown to be the optimal solution. The analytical results lead us to derive a solution procedure for determining the optimal order size and prices.

Journal ArticleDOI
TL;DR: This paper proposes a classification scheme for the different types of disruptions and defines the constraints and objectives that comprise what is called the recovery problem, which is a resource-constrained project scheduling problem with finish–start precedence constraints.
Abstract: In this paper, we study the problem of how to react when an ongoing project is disrupted. The focus is on the resource-constrained project scheduling problem with finish–start precedence constraints. We begin by proposing a classification scheme for the different types of disruptions and then define the constraints and objectives that comprise what we call the recovery problem. The goal is to get back on track as soon as possible at minimum cost, where cost is now a function of the deviation from the original schedule. The problem is formulated as an integer linear program and solved with a hybrid mixed-integer programming/constraint programming procedure that exploits a number of special features in the constraints. The new model is significantly different from the original one because a different set of feasibility conditions and performance requirements must be considered during the recovery process. The complexity of several special cases is analysed. To test the hybrid procedure, 554 20-activity instances were solved and the results compared with those obtained with CPLEX. Computational experiments were also conducted to determine the effects of different factors related to the recovery process.

Journal ArticleDOI
TL;DR: This paper provides the optimal policy for the customer to obtain its minimum cost when the supplier offers not only a permissible delay but also a cash discount and compares the optimal order quantity under supplier credits to the classical economic order quantity.
Abstract: In the classical inventory economic order quantity (or EOQ) model, it was assumed that the supplier is paid for the items immediately after the items are received. However, in practice, the supplier may simultaneously offer the customer: (1) a permissible delay in payments to attract new customers and increase sales, and (2) a cash discount to motivate faster payment and reduce credit expenses. In this paper, we provide the optimal policy for the customer to obtain its minimum cost when the supplier offers not only a permissible delay but also a cash discount. We first establish a proper model, and then characterize the optimal solution and provide an easy-to-use algorithm to find the optimal order quantity and replenishment time. Furthermore, we also compare the optimal order quantity under supplier credits to the classical economic order quantity. Finally, several numerical examples are given to illustrate the theoretical results.
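
A rough numerical sketch of the trade-off described above (not the paper's exact model): for each candidate cycle length, compare the annual cost of paying early to take the cash discount against paying at the end of the permissible delay. All cost expressions and parameter values are illustrative assumptions.

```python
import numpy as np

# Illustrative grid search over the cycle length T for two payment policies.
# Cost components follow the usual EOQ-with-trade-credit building blocks in a
# simplified form; parameter names and values are assumptions.
D, S, c, h = 1200.0, 50.0, 10.0, 0.2        # demand/yr, order cost, unit cost, holding rate
r, M1, M = 0.02, 15 / 365, 45 / 365         # cash discount, discount deadline, credit period
Ic, Ie = 0.12, 0.06                          # interest charged / earned per year

def annual_cost(T, pay_time, unit_price):
    order = S / T
    purchase = D * unit_price
    holding = h * unit_price * D * T / 2
    # interest earned on revenue banked before payment, charged on stock still
    # unsold after the payment time (simplified, illustrative terms)
    earned = Ie * unit_price * D * min(T, pay_time) ** 2 / (2 * T)
    charged = Ic * unit_price * D * max(T - pay_time, 0) ** 2 / (2 * T)
    return order + purchase + holding + charged - earned

Ts = np.linspace(0.02, 0.5, 500)
policies = {"cash discount": (M1, c * (1 - r)), "full delay": (M, c)}
for name, (pay_time, price) in policies.items():
    costs = [annual_cost(T, pay_time, price) for T in Ts]
    i = int(np.argmin(costs))
    print(f"{name}: T*={Ts[i]:.3f} yr, Q*={D * Ts[i]:.0f} units, cost={costs[i]:.1f}")
```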

Journal ArticleDOI
TL;DR: In this paper, the authors compare the performance of a neural network survival analysis model with that of the proportional hazards model for predicting both loan default and early repayment using data from a UK financial institution.
Abstract: Traditionally, credit scoring aimed at distinguishing good payers from bad payers at the time of the application. The timing when customers default is also interesting to investigate since it can provide the bank with the ability to do profit scoring. Analysing when customers default is typically tackled using survival analysis. In this paper, we discuss and contrast statistical and neural network approaches for survival analysis. Compared to the proportional hazards model, neural networks may offer an interesting alternative because of their universal approximation property and the fact that no baseline hazard assumption is needed. Several neural network survival analysis models are discussed and evaluated according to their way of dealing with censored observations, time-varying inputs, the monotonicity of the generated survival curves and their scalability. In the experimental part, we contrast the performance of a neural network survival analysis model with that of the proportional hazards model for predicting both loan default and early repayment using data from a UK financial institution.
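
One common way to let a neural network handle censored loan data, sketched below, is the discrete-time (person-period) formulation: each loan contributes one record per period survived, and a classifier estimates the hazard of default in that period. The simulated data, the network size and this particular formulation are assumptions made for illustration, not the paper's specific models.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n, horizon = 500, 24                        # loans, months observed
x = rng.normal(size=(n, 3))                 # applicant characteristics
true_hazard = 1 / (1 + np.exp(-(x @ [0.8, -0.5, 0.3] - 3.0)))
event_time = rng.geometric(true_hazard)     # month in which the loan defaults
censor_time = rng.integers(6, horizon + 1, size=n)
time = np.minimum(event_time, censor_time)
event = (event_time <= censor_time).astype(int)

# Person-period expansion: one row per month survived, labelled 1 only in the
# month of default; censored loans simply stop contributing rows.
rows, labels = [], []
for i in range(n):
    for t in range(1, int(time[i]) + 1):
        rows.append(np.append(x[i], t))     # covariates plus period index
        labels.append(1 if (event[i] and t == time[i]) else 0)
rows, labels = np.array(rows), np.array(labels)

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=500, random_state=0)
clf.fit(rows, labels)

# Predicted survival curve for one applicant: product of (1 - hazard_t)
grid = np.column_stack([np.tile(x[0], (horizon, 1)), np.arange(1, horizon + 1)])
hazard = clf.predict_proba(grid)[:, 1]
print(np.round(np.cumprod(1 - hazard)[:6], 3))
```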

Journal ArticleDOI
TL;DR: The problem of selecting a subset of requests of maximal profit for a given orbit is addressed by means of a tabu search heuristic, and computational results are reported.
Abstract: Earth observation satellites are platforms equipped with optical instruments that orbit the planet. During the course of an orbit, they take photographs of some regions of the Earth at the request ...

Journal ArticleDOI
TL;DR: This paper proposes two greedy algorithms for packing unequal circles into a two-dimensional rectangular container that selects the next circle to place according to the maximum-hole degree rule, inspired from human activity in packing.
Abstract: In this paper, we study the problem of packing unequal circles into a two-dimensional rectangular container. We solve this problem by proposing two greedy algorithms. The first algorithm, denoted by B1.0, selects the next circle to place according to the maximum-hole degree rule, which is inspired by human activity in packing. The second algorithm, denoted by B1.5, improves B1.0 with a self-look-ahead search strategy. Comparisons with published methods on several instances taken from the literature show the good performance of our approach.
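
A minimal building block of such constructive heuristics is the feasibility test for a candidate placement, sketched below; the maximum-hole-degree scoring itself is not reproduced.

```python
import math

# A circle of radius r at (x, y) is a feasible placement if it lies inside the
# W x H rectangle and does not overlap any already-placed circle.
def feasible(x, y, r, placed, W, H, tol=1e-9):
    if x - r < -tol or y - r < -tol or x + r > W + tol or y + r > H + tol:
        return False                              # sticks out of the container
    return all(math.hypot(x - px, y - py) >= r + pr - tol
               for px, py, pr in placed)          # no pairwise overlap

W, H = 10.0, 6.0
placed = [(2.0, 2.0, 2.0)]                        # one circle already packed
print(feasible(5.5, 2.0, 1.5, placed, W, H))      # True: fits next to it
print(feasible(3.0, 2.0, 1.5, placed, W, H))      # False: overlaps the first
```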

Journal ArticleDOI
TL;DR: Common performance measures used in the retail banking sector include the Gini coefficient, the Kolmogorov–Smirnov statistic, the mean difference, and the information value, but all of these measures use irrelevant information about the magnitude of scores, and fail to use crucial information relating to numbers misclassified.
Abstract: In retail banking, predictive statistical models called ‘scorecards’ are used to assign customers to classes, and hence to appropriate actions or interventions. Such assignments are made on...
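
For reference, the sketch below computes two of the measures named above, the Gini coefficient (via the rank-sum form of the AUC) and the Kolmogorov-Smirnov statistic, directly from a vector of scores and good/bad labels; the data are simulated.

```python
import numpy as np

def gini_and_ks(scores, bad):
    """Gini coefficient (2*AUC - 1) and KS statistic for a scorecard.
    bad = 1 for defaulters, 0 for goods; higher scores are assumed better."""
    scores, bad = np.asarray(scores, float), np.asarray(bad, int)
    order = np.argsort(scores)
    b = bad[order]
    ranks = np.arange(1, len(b) + 1)
    n_bad, n_good = b.sum(), (1 - b).sum()
    # AUC via the rank-sum (Mann-Whitney) formulation, ignoring tie corrections
    auc = (ranks[b == 0].sum() - n_good * (n_good + 1) / 2) / (n_good * n_bad)
    gini = 2 * auc - 1
    # KS: maximum gap between the cumulative bad and good score distributions
    cum_bad = np.cumsum(b) / n_bad
    cum_good = np.cumsum(1 - b) / n_good
    ks = np.max(np.abs(cum_bad - cum_good))
    return gini, ks

rng = np.random.default_rng(0)
bad = rng.integers(0, 2, size=1000)
scores = rng.normal(loc=np.where(bad, -0.5, 0.5))   # goods score higher on average
print(tuple(round(v, 3) for v in gini_and_ks(scores, bad)))
```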

Journal ArticleDOI
TL;DR: A single-parameter metaheuristic method is employed that exploits a list of threshold values to guide intelligently an advanced local search for the open vehicle routeing problem (OVRP), and consistently outperforms previous approaches for the OVRP.
Abstract: In this paper, we consider the open vehicle routeing problem (OVRP), in which routes are not sequences of locations starting and ending at the depot but open paths. The problem is of particular importance for planning fleets of hired vehicles, a common practice in the distribution and service industry. In such cases, the travelling cost is a function of the vehicle open paths. To solve the problem, we employ a single-parameter metaheuristic method that exploits a list of threshold values to guide intelligently an advanced local search. Computational results on a set of benchmark problems show that the proposed method consistently outperforms previous approaches for the OVRP. A real-world example demonstrates the applicability of the method in practice, demonstrating that the approach can be used to solve actual problems of routing large vehicle fleets.
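
The acceptance mechanism behind threshold-based search is easy to illustrate on a single open route: a move is accepted whenever its cost increase stays below the current threshold. The sketch below uses a fixed decreasing threshold list and simple segment reversals; it illustrates the principle only and is not the paper's single-parameter list-based method.

```python
import math
import random

# One open route: a path that starts at the depot and does not return to it.
random.seed(0)
depot = (0.0, 0.0)
customers = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(30)]

def length(route):
    pts = [depot] + [customers[i] for i in route]
    return sum(math.dist(pts[k], pts[k + 1]) for k in range(len(pts) - 1))

route = list(range(len(customers)))
best = route[:]
thresholds = [50.0, 20.0, 10.0, 5.0, 2.0, 1.0, 0.0]   # decreasing threshold list
for T in thresholds:
    for _ in range(2000):
        i, j = sorted(random.sample(range(len(route)), 2))
        candidate = route[:i] + route[i:j + 1][::-1] + route[j + 1:]   # 2-opt style reversal
        if length(candidate) - length(route) < T:      # accept small uphill moves too
            route = candidate
            if length(route) < length(best):
                best = route[:]
print(round(length(best), 1))
```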

Journal ArticleDOI
TL;DR: An empirical study of four machine learning feature selection methods that provide an automatic data mining technique for reducing the feature space and illustrates how these methods help to improve three aspects of the performance of scoring models: model simplicity, model speed and model accuracy.
Abstract: The features used may have an important effect on the performance of credit scoring models. The process of choosing the best set of features for credit scoring models is usually unsystematic and dominated by somewhat arbitrary trial and error. This paper presents an empirical study of four machine learning feature selection methods. These methods provide an automatic data mining technique for reducing the feature space. The study illustrates how the four feature selection methods—‘ReliefF’, ‘Correlation-based’, ‘Consistency-based’ and ‘Wrapper’—help to improve three aspects of the performance of scoring models: model simplicity, model speed and model accuracy. The experiments are conducted on real data sets using four classification algorithms—‘model tree (M5)’, ‘neural network (multi-layer perceptron with back-propagation)’, ‘logistic regression’, and ‘k-nearest-neighbours’.
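
As a concrete illustration of the 'Wrapper' idea, the sketch below runs a greedy forward selection around a classifier, keeping a feature only if it improves cross-validated accuracy. The data set, classifier and stopping rule are assumptions; the paper's own methods and credit data are not reproduced.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=600, n_features=12, n_informative=4,
                           random_state=0)
clf = LogisticRegression(max_iter=1000)

# Greedy forward selection: repeatedly add the feature that most improves
# cross-validated accuracy, and stop when no feature helps any more.
selected, remaining, best_score = [], list(range(X.shape[1])), 0.0
while remaining:
    scores = {f: cross_val_score(clf, X[:, selected + [f]], y, cv=5).mean()
              for f in remaining}
    f_best, s_best = max(scores.items(), key=lambda kv: kv[1])
    if s_best <= best_score + 1e-4:
        break
    selected.append(f_best)
    remaining.remove(f_best)
    best_score = s_best
print("selected features:", selected, "cv accuracy:", round(best_score, 3))
```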

Journal ArticleDOI
TL;DR: Support vector machines (SVM) from statistical learning theory divide a set of labelled credit applicants into subsets of ‘typical’ and ‘critical’ patterns, and linear discriminant analysis with prior training-subset selection via SVM leads to improved generalization.
Abstract: Credit applicants are assigned to good or bad risk classes according to their record of defaulting. Each applicant is described by a high-dimensional input vector of situational characteristics and by an associated class label. A statistical model, which maps the inputs to the labels, can decide whether a new credit applicant should be accepted or rejected, by predicting the class label given the new inputs. Support vector machines (SVM) from statistical learning theory can build such models from the data, requiring extremely weak prior assumptions about the model structure. Furthermore, SVM divide a set of labelled credit applicants into subsets of ‘typical’ and ‘critical’ patterns. The correct class label of a typical pattern is usually very easy to predict, even with linear classification methods. Such patterns do not contain much information about the classification boundary. The critical patterns (the support vectors) contain the less trivial training examples. For instance, linear discriminant analysis with prior training subset selection via SVM also leads to improved generalization. Using non-linear SVM, more ‘surprising’ critical regions may be detected, but owing to the relative sparseness of the data, this potential seems to be limited in credit scoring practice.
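
The idea of training a simple classifier on the 'critical' patterns only can be sketched as follows: fit a linear SVM, take its support vectors as the critical subset, and fit linear discriminant analysis on that subset. The synthetic data and parameter choices are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=1500, n_features=10, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

svm = SVC(kernel="linear", C=1.0).fit(X_tr, y_tr)
critical = svm.support_                       # indices of the support vectors

lda_all = LinearDiscriminantAnalysis().fit(X_tr, y_tr)                 # all applicants
lda_sv = LinearDiscriminantAnalysis().fit(X_tr[critical], y_tr[critical])  # critical only

print("LDA on all applicants:      ", round(lda_all.score(X_te, y_te), 3))
print("LDA on support vectors only:", round(lda_sv.score(X_te, y_te), 3))
```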

Journal ArticleDOI
TL;DR: This paper presents a hybrid algorithm comprising the two metaheuristics of tabu search and variable neighbourhood descent to solve the backhauling problem associated with mixed and simultaneous delivery and pick-ups.
Abstract: Metaheuristics are a class of approximate methods designed to solve hard combinatorial optimization problems arising within various different areas. The importance of metaheuristics results from their ability to continue the search beyond a local optimum so that near-optimal or optimal solutions are efficiently found. In order to solve the backhauling problem associated with mixed and simultaneous delivery and pick-ups, this paper presents a hybrid algorithm comprising the two metaheuristics of tabu search and variable neighbourhood descent. The primary challenge associated with backhauling consists of creating routes in which vehicles are not only required to deliver goods, but also to perform pick-ups at customer locations. These two categories of problems, however, have received little attention in the literature to date. A set of examples taken from the literature with Euclidean cost matrices is presented. Finally, some numerical results are illustrated to show the effectiveness of the proposed approach.

Journal ArticleDOI
TL;DR: Results based on real examination scheduling problems including standard benchmark data show that the final implementation is able to compete effectively with the best-known solution approaches to the problem.
Abstract: Ant colony optimization is an evolutionary search procedure based on the way that ant colonies cooperate in locating shortest routes to food sources. Early implementations focussed on the travelling salesman and other routing problems but it is now being applied to an increasingly diverse range of combinatorial optimization problems. This paper is concerned with its application to the examination scheduling problem. It builds on an existing implementation for the graph colouring problem to produce clash-free timetables and goes on to consider the introduction of a number of additional practical constraints and objectives. A number of enhancements and modifications to the original algorithm are introduced and evaluated. Results based on real examination scheduling problems, including standard benchmark data (the Carter data set), show that the final implementation is able to compete effectively with the best-known solution approaches to the problem.

Journal ArticleDOI
TL;DR: It is shown that given a certain sample size, sample bias has a significant effect on consumer credit-scoring performance and profitability, and its effect is composed of the inclusion of rejected orders in the scoring model, and—to a lesser extent—the inclusion of these orders into the variable-selection process.
Abstract: This article seeks to gain insight into the influence of sample bias in a consumer credit scoring model. In earlier research, sample bias has been suggested to pose a sizeable threat to predictive performance and profitability due to its implications on either population drainage or biased estimates. Contrary to previous—mainly theoretical—research on sample bias, the unique features of the data set used in this study provide the opportunity to investigate the issue in an empirical setting. Based on the data of a mail-order company offering short-term consumer credit to its customers, we show that (i) given a certain sample size, sample bias has a significant effect on consumer credit-scoring performance and profitability, (ii) its effect is composed of the inclusion of rejected orders in the scoring model, and—to a lesser extent—the inclusion of these orders into the variable-selection process, and (iii) the impact of the effect of sample bias on consumer credit-scoring performance and profitability is modest.

Journal ArticleDOI
TL;DR: The development of a model that calculates the required number of supplementary nurses per shift, and also encapsulates the time-dependent nature of elective surgery admissions and complex duration-of-stay profiles, is presented in this paper.
Abstract: The intensive care unit (ICU) of a hospital is an essential yet costly resource. Consequently, intensive care modelling has become increasingly prevalent in recent years in attempts to increase efficiency and reduce costs. Previous models have usually assumed that the numbers of beds available are restricted; when all beds are occupied, any additional patients are referred elsewhere or elective surgeries are cancelled. In this study, activities at the ICU at a large teaching hospital were modelled using data relating to all admissions to the ICU during the year 2000—a total of 1084 admissions. The unit is unusual in that the majority of patients referred for intensive care therapy are admitted. Bed numbers are increased when necessary to cope with demand. However, nurses are a restricted resource. In order to maintain the required nurse:patient ratio of at least one:one, supplementary nurses are employed during busy periods. Supplementary nurse costs are substantial and so nurse utilization must be closely monitored. The development of a model that calculates the required number of supplementary nurses per shift, and also encapsulates the time-dependent nature of elective surgery admissions and complex duration-of-stay profiles, is presented in this paper. In particular, the model is used to determine the number of rostered nurses that are required to minimize overall nursing staff costs.
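
A back-of-envelope version of the staffing calculation, assuming a one-to-one nurse:patient ratio: each shift needs as many nurses as occupied beds, any shortfall against the rostered level is covered by more expensive supplementary nurses, and the rostered level is chosen to minimize total cost. The occupancy and cost figures below are invented for illustration; the paper's model of time-dependent admissions and duration-of-stay profiles is far more detailed.

```python
import numpy as np

rng = np.random.default_rng(2)
occupancy = rng.poisson(lam=9, size=3 * 365)             # patients per shift, 3 shifts/day
cost_rostered, cost_supplementary = 1.0, 1.6             # relative cost per nurse-shift

def annual_cost(rostered):
    # shortfall in any shift is covered by supplementary (agency) nurses
    supplementary = np.maximum(occupancy - rostered, 0)
    return (cost_rostered * rostered * len(occupancy)
            + cost_supplementary * supplementary.sum())

costs = {n: annual_cost(n) for n in range(5, 16)}
best = min(costs, key=costs.get)
print("rostered nurses per shift:", best, "relative annual cost:", round(costs[best]))
```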

Journal ArticleDOI
TL;DR: A general company-wide management information system for defining procurement strategies, which uses total cost of ownership information and argues that mathematical programming models should be used for exploiting this information, when evaluating the firm's strategic procurement options.
Abstract: We present a general company-wide management information system for defining procurement strategies. We believe that existing practices for determining purchasing strategies can be improved and a new approach developed. The system uses total cost of ownership information. We argue that mathematical programming models should be used for exploiting this information, when evaluating the firm's strategic procurement options. As an example, we show how we have successfully applied our approach to develop a decision support system at Usinor, a European multinational steel company.

Journal ArticleDOI
TL;DR: A new enhancement of the Clarke–Wright savings method for the classical capacitated vehicle routing problem is proposed which differs from the previous ones in its saving criterion: Customer demands are considered in addition to distances.
Abstract: In this work we are concerned with the Clarke–Wright savings method for the classical capacitated vehicle routing problem. This is an NP-hard problem and numerous heuristic solution methods have be...
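
For context, the sketch below is a compact parallel-savings construction with a demand-adjusted term added to the classical saving s_ij = d(0,i) + d(0,j) - d(i,j); the extra term is only an illustrative way of bringing customer demands into the ranking and is not the paper's criterion.

```python
import math

# Depot is node 0; the other nodes are customers with a delivery demand.
coords = {0: (50, 50), 1: (20, 80), 2: (25, 85), 3: (80, 20), 4: (85, 25), 5: (55, 90)}
demand = {1: 6, 2: 4, 3: 7, 4: 3, 5: 5}
capacity, g = 10, 0.5      # vehicle capacity; weight of the illustrative demand term

def d(i, j):
    return math.dist(coords[i], coords[j])

# Start with one route per customer and merge route ends in order of saving.
routes = [[i] for i in demand]
where = {i: k for k, r in enumerate(routes) for i in r}
load = {k: demand[r[0]] for k, r in enumerate(routes)}

savings = sorted(((d(0, i) + d(0, j) - d(i, j) + g * abs(demand[i] - demand[j]), i, j)
                  for i in demand for j in demand if i < j), reverse=True)

for s, i, j in savings:
    ri, rj = where[i], where[j]
    if ri == rj or load[ri] + load[rj] > capacity:
        continue
    a, b = routes[ri], routes[rj]
    if a[-1] == i and b[0] == j:          # join ...-i with j-...
        merged = a + b
    elif b[-1] == j and a[0] == i:
        merged = b + a
    elif a[-1] == i and b[-1] == j:
        merged = a + b[::-1]
    elif a[0] == i and b[0] == j:
        merged = a[::-1] + b
    else:
        continue                           # i or j lies inside its route
    routes[ri], routes[rj] = merged, []
    load[ri], load[rj] = load[ri] + load[rj], 0
    for c in merged:
        where[c] = ri

print("routes:", [r for r in routes if r])
```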

Journal ArticleDOI
TL;DR: This work describes and compares three approaches to the modeling and solution of the car sequencing problem, and proposes an adaptation of the Ant Colony Optimization for the problem.
Abstract: The car sequencing problem is the ordering of the production of a list of vehicles which are of the same type, but which may have options or variations that require higher work content and longer operation times for at least one assembly workstation. A feasible production sequence is one that does not schedule vehicles with options in such a way that one or more workstations are overloaded. In variations of the problem, other constraints may apply. We describe and compare three approaches to the modeling and solution of this problem. The first uses integer programming to model and solve the problem. The second approaches the question as a constraint satisfaction problem (CSP). The third method proposes an adaptation of the Ant Colony Optimization for the car sequencing problem. Test-problems are drawn from CSPLib, a publicly available set of problems available through the Internet. We quote results drawn both from our own work and from other research. The literature review is not intended to be exhaustive but we have sought to include representative examples and the more recent work. Our conclusions bear on likely research avenues for the solution of problems of practical size and complexity. A new set of larger benchmark problems was generated and solved. These problems are available to other researchers who may wish to solve them using their own methods.
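
The core feasibility rule is the usual CSPLib 'p/q' capacity constraint: for each option, at most p cars requiring it may appear in any window of q consecutive positions. A minimal checker, with invented data, is sketched below.

```python
# Count p/q window violations for a candidate production sequence.
options = {"sunroof": (1, 2), "aircon": (2, 3)}          # option -> (p, q)
cars = [{"sunroof"}, set(), {"aircon"}, {"sunroof", "aircon"}, {"aircon"}, set()]

def violations(sequence, options):
    count = 0
    for opt, (p, q) in options.items():
        flags = [1 if opt in car else 0 for car in sequence]
        for start in range(len(flags) - q + 1):
            if sum(flags[start:start + q]) > p:           # window over capacity
                count += 1
    return count

print(violations(cars, options))   # 0 means the sequence is feasible
```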

Journal ArticleDOI
TL;DR: Wagner models can solve problems containing larger numbers of machines and jobs than the Manne models, and hence are preferable for finding optimal solutions to the permutation flowshop problem with makespan objective.
Abstract: This paper investigates the performance of two families of mixed-integer linear programming (MILP) models for solving the regular permutation flowshop problem to minimize makespan. The three models of the Wagner family incorporate the assignment problem while the five members of the Manne family use pairs of dichotomous constraints, or their mathematical equivalents, to assign jobs to sequence positions. For both families, the problem size complexity and computational time required to optimally solve a common set of problems are investigated. In so doing, this paper extends the application of MILP approaches to larger problem sizes than those found in the existing literature. The Wagner models require more than twice the binary variables and more real variables than do the Manne models, while Manne models require more constraints for the same sized problems. All Wagner models require much less computational time than any of the Manne models for solving the common set of problems, and these differences increase dramatically with increasing numbers of jobs and machines. Wagner models can solve problems containing larger numbers of machines and jobs than the Manne models, and hence are preferable for finding optimal solutions to the permutation flowshop problem with makespan objective.

Journal ArticleDOI
TL;DR: It is illustrated that a revenue-sharing contract can optimize the chain and bring win–win situations to the players in the industry.
Abstract: Recent developments in video rental supply chains seem to indicate that revenue sharing contracts are beneficial to all parties involved in the industry. This paper illustrates a theoretical underpinning of the observed practice. A video rental supply chain is modelled to study pricing and replenishment decision making by the two autonomous firms in the chain, namely a movie studio producing the tapes and a video rental shop renting the tapes to customers. In the model, the movie studio sets the price for selling the tapes to the video rental store, and the video rental shop must decide the number of copies of the new movie videotape it should purchase. The paper illustrates that a revenue-sharing contract can optimize the chain and bring win–win situations to the players in the industry.
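
A stylized numerical illustration of the win-win claim (not the paper's model): with a wholesale-only contract the shop under-stocks tapes, whereas a suitably chosen revenue-sharing contract induces the chain-optimal stock and can leave both firms better off. The demand distribution, prices and contract parameters below are assumptions.

```python
import numpy as np
from scipy.stats import poisson

# Studio sells tapes at wholesale price w and keeps a share (1 - phi) of the
# rental revenue; the shop chooses how many tapes n to buy. Season rental
# demand D is Poisson; each rented tape earns r over the season.
r, c, lam = 25.0, 10.0, 50.0        # rental revenue per tape, tape cost, mean demand

def exp_rentals(n):                  # E[min(n, D)] for D ~ Poisson(lam)
    k = np.arange(n)
    return float((k * poisson.pmf(k, lam)).sum() + n * poisson.sf(n - 1, lam))

def shop_best_n(w, phi, n_max=120):  # the shop stocks to maximize its own profit
    profits = [phi * r * exp_rentals(n) - w * n for n in range(n_max)]
    return int(np.argmax(profits))

for name, w, phi in [("wholesale only", 20.0, 1.0),
                     ("revenue sharing", 3.5, 0.35)]:
    n = shop_best_n(w, phi)
    rentals = exp_rentals(n)
    shop = phi * r * rentals - w * n
    studio = (1 - phi) * r * rentals + (w - c) * n
    print(f"{name:15s} n={n:3d} shop={shop:6.1f} studio={studio:6.1f} chain={shop + studio:6.1f}")
```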