
Showing papers in "Journal of the Operational Research Society in 2000"


Journal ArticleDOI
TL;DR: This work proposes a new approach in which some or all of the coefficients of the LP are specified as intervals, and finds the best optimum and the worst optimum for the model, and the point settings of the interval coefficients that yield these two extremes.
Abstract: In order to solve a linear programme, the model coefficients must be fixed at specific values, which implies that the coefficients are perfectly accurate. In practice, however, the coefficients are generally estimates. The only way to deal with uncertain coefficients is to test the sensitivity of the model to changes in their values, either singly or in very small groups. We propose a new approach in which some or all of the coefficients of the LP are specified as intervals. We then find the best optimum and the worst optimum for the model, and the point settings of the interval coefficients that yield these two extremes. This provides the range of the optimised objective function, and the coefficient settings give some insight into the likelihood of these extremes.
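For the special case where only the objective coefficients are intervals (interval constraint coefficients, which the approach also covers, require more care), the best and worst optima of a maximisation LP are attained at the interval endpoints. A minimal sketch with invented data, using `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

# Invented toy data: maximise c.x s.t. A x <= b, x >= 0, with each
# objective coefficient known only to lie in an interval [c_lo, c_hi].
A = [[1, 0], [0, 2], [3, 2]]
b = [4, 12, 18]
c_lo = np.array([2.0, 4.0])  # lower interval endpoints
c_hi = np.array([3.0, 5.0])  # upper interval endpoints

def optimum(c):
    res = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2)  # linprog minimises
    return -res.fun

# With x >= 0 the objective is monotone in c, so the extremes occur
# at the interval endpoints.
best, worst = optimum(c_hi), optimum(c_lo)
print(worst, best)
```

For this toy instance the optimised objective ranges from 28 to 36 over the coefficient intervals.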

323 citations


Journal ArticleDOI
TL;DR: The formulation and calibration of a system dynamics model of the interaction of demand pattern, A&E resource deployment, other hospital processes and bed numbers are discussed; and the outputs of policy analysis runs of the model which vary a number of the key parameters have policy implications.
Abstract: This paper discusses the formulation and calibration of a system dynamics model of the interaction of demand pattern, Accident and Emergency (A&E) resource deployment, other hospital processes and bed numbers, together with the outputs of policy analysis runs of the model which vary a number of the key parameters. Two significant findings have policy implications. One is that while some delays to patients are unavoidable, reductions can be achieved by selective augmentation of resources within, and relating to, the A&E unit. The second is that reductions in bed numbers do not increase waiting times for emergency admissions; their effect instead is to increase sharply the number of cancellations of admissions for elective surgery. This suggests that basing A&E policy solely on any single criterion will merely succeed in transferring the effects of a resource deficit to a different patient group.

305 citations


Journal ArticleDOI
TL;DR: This paper proposes that a particular account of the philosophy of science, known as ‘critical realism’, is especially suitable as an underpinning of OR/MS.
Abstract: Many issues are under debate as to the philosophical nature of OR/MS: is it science or technology? Is it natural or social science? Can it be realist as well as being interpretivist? There are also many debates within the philosophy of science itself. This paper proposes that a particular account of the philosophy of science, known as 'critical realism', is especially suitable as an underpinning of OR/MS. The structure of the argument of this paper is to outline the main positions within the philosophy of science and highlight their problems, especially from the point of view of OR/MS; then to introduce critical realism and to show how it addresses these problems and how it is particularly appropriate for OR/MS; and finally to illustrate this by considering examples of various practical OR methods.

275 citations


Journal ArticleDOI
TL;DR: The FPM is compared with the main existing prioritisation methods in order to evaluate its performance and it is shown that it possesses some attractive properties and could be used as an alternative to the known prioritisation Methods, especially when the preferences of the decision-maker are strongly inconsistent.
Abstract: The estimation of priorities from pairwise comparison matrices is the major constituent of the Analytic Hierarchy Process (AHP). The priority vector can be derived from these matrices using different techniques, of which the most commonly used are the Eigenvector Method (EVM) and the Logarithmic Least Squares Method (LLSM). In this paper a new Fuzzy Programming Method (FPM) is proposed, based on a geometrical representation of the prioritisation process. This method transforms the prioritisation problem into a fuzzy programming problem that can easily be solved as a standard linear programme. The FPM is compared with the main existing prioritisation methods in order to evaluate its performance. It is shown that it possesses some attractive properties and could be used as an alternative to the known prioritisation methods, especially when the preferences of the decision-maker are strongly inconsistent.
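The LLSM baseline mentioned above reduces to normalised row geometric means of the comparison matrix. A small sketch (the weights and matrix here are invented for illustration):

```python
import math

def llsm_priorities(A):
    # Row geometric means of the comparison matrix, normalised to sum to one.
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]
    total = sum(gm)
    return [g / total for g in gm]

# Perfectly consistent matrix built from invented weights (0.5, 0.3, 0.2):
w = [0.5, 0.3, 0.2]
A = [[wi / wj for wj in w] for wi in w]
print(llsm_priorities(A))  # recovers the weights exactly for a consistent matrix
```

For an inconsistent matrix the methods disagree, which is where the FPM comparison in the paper becomes relevant.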

238 citations


Journal ArticleDOI
TL;DR: A heuristic method for solving the open vehicle routing problem, based on a minimum spanning tree with penalties procedure, is presented and results are provided.
Abstract: The open vehicle routing problem (OVRP) differs from the classic vehicle routing problem (VRP) because the vehicles either are not required to return to the depot, or they have to return by revisiting the customers assigned to them in the reverse order. Therefore, the vehicle routes are not closed paths but open ones. A heuristic method for solving this new problem, based on a minimum spanning tree with penalties procedure, is presented. Computational results are provided.
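The core subproblem of the heuristic, a minimum spanning tree, can be computed with Prim's algorithm; the paper's penalty scheme is omitted here and the distance matrix is invented:

```python
import heapq

def prim_mst(dist):
    # Prim's algorithm with lazy deletion; dist is a symmetric distance matrix.
    n = len(dist)
    in_tree = [False] * n
    in_tree[0] = True
    heap = [(dist[0][j], 0, j) for j in range(1, n)]
    heapq.heapify(heap)
    edges, total = [], 0
    while heap:
        w, i, j = heapq.heappop(heap)
        if in_tree[j]:
            continue          # stale entry: j already connected
        in_tree[j] = True
        edges.append((i, j))
        total += w
        for k in range(n):
            if not in_tree[k]:
                heapq.heappush(heap, (dist[j][k], j, k))
    return edges, total

# Invented 4-customer distance matrix (node 0 as depot):
D = [[0, 1, 4, 3],
     [1, 0, 2, 5],
     [4, 2, 0, 6],
     [3, 5, 6, 0]]
print(prim_mst(D))
```

In the OVRP setting the tree edges suggest open paths from the depot, which the penalty procedure then shapes into feasible routes.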

203 citations


Journal ArticleDOI
TL;DR: A methodology for the automatic generation of computerised solutions to the container stowage problem is shown; objective functions that provide a basis for evaluating solutions are given in addition to the underlying structures and relationships that embody this problem.
Abstract: The container stowage problem concerns the suitable placement of containers in a container-ship on a multi-port journey; it requires consideration of the consequences each placement has on decisions at subsequent ports. A methodology for the automatic generation of computerised solutions to the container stowage problem is shown; objective functions that provide a basis for evaluating solutions are given in addition to the underlying structures and relationships that embody this problem. The methodology progressively refines the placement of containers within the cargo-space of a container ship until each container is specifically allocated to a stowage location. The methodology embodies a two-stage process of computerised planning: a generalised placement strategy followed by a specialised placement procedure. Heuristic rules are built into objective functions for each stage that enable the combinatorial tree to be explored in an intelligent way, resulting in good, if not optimal, solutions for the problem in a reasonable processing time.

154 citations


Journal ArticleDOI
TL;DR: This paper illustrates how a modern heuristic and two classical integer programming models have been combined to provide a solution to a nurse rostering problem at a major UK hospital.
Abstract: This paper illustrates how a modern heuristic and two classical integer programming models have been combined to provide a solution to a nurse rostering problem at a major UK hospital. Neither a heuristic nor an exact approach based on a standard IP package was able to meet all the practical requirements. This was overcome by using a variant of tabu search as the core method, but applying knapsack and network flow models in pre- and post-processing phases. The result is a successful software tool that frees senior nursing staff from a time-consuming administrative task.

152 citations


Journal ArticleDOI
TL;DR: The variety of ways in which disruptions occur and the variety of consequences that may unfold are presented, and the role of dynamic feedback and the ‘portfolio effect’ is introduced, particularly with reference to project acceleration and changing productivity.
Abstract: The idea that small disruptions and delays can cause serious consequences to the life of a major project, well beyond that which might be easily attributed to their direct impact, is well established. Nevertheless, the nature of this 'delay and disruption' is still not fully understood. This paper discusses some of the issues and difficulties in gaining a full understanding. In particular it presents the variety of ways in which disruptions occur, and the variety of consequences that may unfold. It also focuses attention on a number of issues that arise when 'normal' methods of analysis of complex projects might be used, for example, the analysis and costing of change orders and the use of network analysis. The role of dynamic feedback and the 'portfolio effect' is introduced, particularly with reference to project acceleration and changing productivity.

152 citations


Journal ArticleDOI
TL;DR: A model is presented to advise at a monitoring check what maintenance action to take based upon the condition monitoring and preventive maintenance information obtained to date, relevant to a large class of condition monitoring techniques currently employed in industry including vibration and oil analysis.
Abstract: This paper considers a stochastic dynamic system subject to random deterioration, with regular condition monitoring and preventive maintenance. A model is presented to advise at a monitoring check what maintenance action to take based upon the condition monitoring and preventive maintenance information obtained to date. A general assumption adopted in the paper is that the performance of the system concerned cannot be described directly by the monitored information, but is correlated with it stochastically. The model is relevant to a large class of condition monitoring techniques currently employed in industry including vibration and oil analysis. The model is constructed under fairly general conditions and includes two novel developments. Firstly, the concept of the conditional residual time is used to measure the condition of the monitored system at the time of a monitoring check, and secondly, contrary to previous practice, the monitored observation is now assumed to be a function of the system condition. Relationships between the observed history of condition monitoring, preventive maintenance actions, and the condition of the system are established. Methods for estimating model parameters are discussed. Since the model presented is generally beyond the scope of an analytical solution, a numerical approximation method is also proposed. Finally, a case example is presented to illustrate the modelling concepts in the case of non-maintained plant.

149 citations


Journal ArticleDOI
TL;DR: The findings of this study illustrate that mergers do increase a hospital's level of efficiency and indicate the role of scale efficiency as a dominant source of improvement in the efficiency of hospitals involved in horizontal mergers.
Abstract: Due to the wave of mergers that took place in the USA, the early 1990s could be labelled a restructuring era for health care systems. The question of whether mergers have an impact on organizational performance is still an area of interest for health services researchers. In this study, we examined the impact of horizontal mergers on US hospitals' technical efficiency before and after merger using longitudinal Data Envelopment Analysis (DEA). The findings of our study illustrate that mergers do increase a hospital's level of efficiency. The constant returns-to-scale model indicated an overall reduction in input utilisation after merger, compared to the variable returns-to-scale model. This indicates the role of scale efficiency, rather than technical efficiency, as the dominant source of improvement for hospitals involved in horizontal mergers. Suggestions for future study are provided.

148 citations


Journal ArticleDOI
TL;DR: The contribution of Data Envelopment Analysis to inform management is explored and illustrated in an application to the University of Warwick, using concepts from a technique to support strategic option formulation, the Boston Consulting Group (BCG) matrix.
Abstract: This paper examines the process of performance measurement undertaken by different stakeholders in the UK higher education sector, focusing on the institutional perspective. Different classes of stakeholders have different motivations to measure performance. Institutions are affected on the one hand by the state's evaluation of them, and on the other by applicants' evaluations. The contribution of Data Envelopment Analysis (DEA) to inform management is explored and illustrated in an application to the University of Warwick, using concepts from a technique to support strategic option formulation, the Boston Consulting Group (BCG) matrix.

Journal ArticleDOI
TL;DR: The role that formal modelling, both qualitative and quantitative, and the use of a group support system can play in developing strategic direction and the way in which patterns often express the distinctiveness of competencies is discussed.
Abstract: The paper discusses the role that formal modelling, both qualitative and quantitative, and the use of a group support system can play in developing strategic direction. In particular the paper focuses on the modelling of competencies as patterns and the way in which patterns often express the distinctiveness of competencies. The relationship between patterns of competencies and the goals of an organisation is explored as the basis for establishing core distinctive competencies and for developing and exploring the business model which will inform strategic direction. As an introduction the nature of strategic management is discussed, as it relates to the role of modelling competencies.

Journal ArticleDOI
TL;DR: The concept of a membership function used in fuzzy set theory for representing imprecise data is adopted and the smallest possible, most possible, and largest possible values of the missing data are derived from the observed data to construct a triangular membership function.
Abstract: In measuring the relative efficiencies of a set of decision making units (DMUs) via data envelopment analysis (DEA), detailed inputs and outputs are usually involved. However, there are cases where some DMUs are unable to provide all the necessary data. This paper adopts the concept of a membership function used in fuzzy set theory for representing imprecise data. The smallest possible, most possible, and largest possible values of the missing data are derived from the observed data to construct a triangular membership function. With the membership function, a fuzzy DEA model can be utilized to calculate the efficiency scores. Since the efficiency scores are fuzzy numbers, they are more informative than crisp efficiency scores calculated by assuming crisp values for the missing data. As an illustration, the efficiency scores of the 24 university libraries in Taiwan, with three missing values, are calculated to show the extent to which the actual amount of resources and services provided by each university deviates from the technically efficient amount. This methodology can also be applied to calculate the relative efficiencies of DMUs with imprecise linguistic data.
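The triangular membership function itself is straightforward to construct once the smallest possible, most possible and largest possible values have been derived; a sketch with invented endpoints:

```python
def triangular(lo, mode, hi):
    """Membership function of a triangular fuzzy number built from the
    smallest possible (lo), most possible (mode) and largest possible (hi)
    values of a missing datum."""
    def mu(x):
        if x <= lo or x >= hi:
            return 0.0
        if x <= mode:
            return (x - lo) / (mode - lo)   # rising edge
        return (hi - x) / (hi - mode)       # falling edge
    return mu

# Invented endpoints for a missing datum:
mu = triangular(2.0, 5.0, 10.0)
print(mu(2.0), mu(5.0), mu(7.5))
```

Membership peaks at 1 at the most possible value and falls linearly to 0 at the endpoints; the fuzzy DEA model then propagates these numbers through the efficiency calculation.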

Journal ArticleDOI
TL;DR: A set partitioning approach consisting of two phases is proposed for the combined ship scheduling and allocation problem and optimal solutions are obtained on several cases of a real ship planning problem.
Abstract: We present a bulk ship scheduling problem that is a combined multi-ship pickup and delivery problem with time windows (m-PDPTW) and multi-allocation problem. In contrast to other ship scheduling problems found in the literature, each ship in the fleet is equipped with a flexible cargo hold that can be partitioned into several smaller holds in a given number of ways. Therefore, multiple products can be carried simultaneously by the same ship. The scheduling of the ships constitutes the m-PDPTW, while the partition of the ships' flexible cargo holds and the allocation of cargoes to the smaller holds make up the multi-allocation problem. A set partitioning approach consisting of two phases is proposed for the combined ship scheduling and allocation problem. In the first phase, a number of candidate schedules (including allocation of cargoes to the ships' cargo holds) is generated for each ship. In the second phase, we minimise transportation costs by solving a set partitioning problem where the columns are the candidate schedules generated in phase one. The computational results show that the proposed approach works, and optimal solutions are obtained on several cases of a real ship planning problem.

Journal ArticleDOI
TL;DR: The aim has been to design an SDSS so that it provides an interactive evacuation simulator with dynamic graphics that allows for experimentation with policies by providing rapid feedback from the simulation.
Abstract: A prototype spatial decision support system (SDSS) has been designed for contingency planning for emergency evacuations, which combines simulation techniques with the spatial data handling and display capabilities of a geographical information system (GIS). It links together the topographical support and analysis provided by the GIS (ARC/INFO) with a simulation model designed to simulate the dynamics of an evacuation process in detail. Our aim has been to design an SDSS so that it provides an interactive evacuation simulator with dynamic graphics that allows for experimentation with policies by providing rapid feedback from the simulation. The idea is that emergency planners will be able to use the SDSS to experiment with emergency evacuation plans in order to plan for different contingencies. This paper concentrates on the issues involved in designing an effective integration link interface between the GIS and the simulation model when building an SDSS of this type.

Journal ArticleDOI
TL;DR: A case study involving two insurance problems is presented and solved using a variety of techniques within the methodology of data mining: understanding customer retention patterns by classifying policy holders as likely to renew or terminate their policies, and identifying types of policy holders who are more at risk of claiming.
Abstract: The insurance industry is concerned with many problems of interest to the operational research community. This paper presents a case study involving two such problems and solves them using a variety of techniques within the methodology of data mining. The first of these problems is the understanding of customer retention patterns by classifying policy holders as likely to renew or terminate their policies. The second is better understanding claim patterns, and identifying types of policy holders who are more at risk. Each of these problems impacts on the decisions relating to premium pricing, which directly affects profitability. A data mining methodology is used which views the knowledge discovery process within a holistic framework utilising hypothesis testing, statistics, clustering, decision trees, and neural networks at various stages. The impacts of the case study on the insurance company are discussed.

Journal ArticleDOI
TL;DR: In this article, a so-called critical level policy is applied to ration the inventory among the two demand classes, where low priority demand is rejected in anticipation of future high-priority demand whenever the inventory level is at or below a prespecified critical level.
Abstract: Whenever demand for a single item can be categorised into classes of different priority, an inventory rationing policy should be considered. In this paper we analyse a continuous review (s, Q) model with lost sales and two demand classes. A so-called critical level policy is applied to ration the inventory among the two demand classes. With this policy, low-priority demand is rejected in anticipation of future high-priority demand whenever the inventory level is at or below a prespecified critical level. For Poisson demand and deterministic lead times, we present an exact formulation of the average inventory cost. A simple optimisation procedure is presented, and in a numerical study we compare the optimal rationing policy with a policy where no distinction between the demand classes is made. The benefit of the rationing policy is investigated for various cases and the results show that significant cost reductions can be obtained.
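The critical-level rule can be illustrated with a toy simulation: low-priority demand is rejected whenever on-hand stock is at or below the critical level. All parameters below are invented, and the periodic-review loop is a simplification of the continuous-review model in the paper:

```python
import random

def high_fill_rate(crit, s=6, Q=12, lead=4, horizon=20_000, seed=7):
    """Toy simulation of an (s, Q) lost-sales system with two demand
    classes and a critical-level rationing rule; periodic review is a
    simplification and all parameters are invented."""
    rng = random.Random(seed)
    on_hand, on_order = s + Q, []
    served = {"high": 0, "low": 0}
    lost = {"high": 0, "low": 0}
    for t in range(horizon):
        on_hand += sum(q for due, q in on_order if due == t)  # receive orders
        on_order = [(due, q) for due, q in on_order if due > t]
        for _ in range(rng.randint(0, 3)):                    # customers this period
            cls = "high" if rng.random() < 0.3 else "low"
            # rationing: low-priority demand served only above the critical level
            if on_hand > 0 and (cls == "high" or on_hand > crit):
                on_hand -= 1
                served[cls] += 1
            else:
                lost[cls] += 1
        if on_hand + sum(q for _, q in on_order) <= s:        # inventory position
            on_order.append((t + lead, Q))
    return served["high"] / (served["high"] + lost["high"])

print(high_fill_rate(0), high_fill_rate(3))
```

Comparing `crit=0` (no rationing) against a positive critical level shows how stock is reserved for the high-priority class at the expense of low-priority fill.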

Journal ArticleDOI
TL;DR: This study applies stochastic programming modelling and solution techniques to planning problems for a consortium of oil companies and involves decisions in both space and time and careful revision of the original deterministic formulation of the DROP model.
Abstract: In this paper we apply stochastic programming modelling and solution techniques to planning problems for a consortium of oil companies. A multiperiod supply, transformation and distribution scheduling problem, the Depot and Refinery Optimization Problem (DROP), is formulated for strategic or tactical level planning of the consortium's activities. This deterministic model is used as a basis for implementing a stochastic programming formulation with uncertainty in the product demands and spot supply costs (DROPS), whose solution process utilizes the deterministic equivalent linear programming problem. We employ our STOCHGEN general purpose stochastic problem generator to 'recreate' the decision (scenario) tree for the unfolding future as this deterministic equivalent. To project random demands for oil products at different spatial locations into the future and to generate random fluctuations in their future prices/costs a stochastic input data simulator is developed and calibrated to historical industry data. The models are written in the modelling language XPRESS-MP and solved by the XPRESS suite of linear programming solvers. From the viewpoint of implementation of large-scale stochastic programming models this study involves decisions in both space and time and careful revision of the original deterministic formulation. The first part of the paper treats the specification, generation and solution of the deterministic DROP model. The stochastic version of the model (DROPS) and its implementation are studied in detail in the second part and a number of related research questions and implications are discussed.

Journal ArticleDOI
TL;DR: This paper formulates and solves the decoupling point location problem in supply chains as a total relevant cost (sum of inventory carrying cost and delay costs) minimisation problem and compares the performance of two production planning and control policies in terms of total cost.
Abstract: In this paper, we investigate a dynamic modelling technique for analysing supply chain networks using generalised stochastic Petri nets (GSPNs). The customer order arrival process is assumed to be Poisson and the service processes at the various facilities of the supply chain are assumed to be exponential. Our model takes into account both the procurement process and delivery logistics that exist between any two members of the supply chain. We compare the performance of two production planning and control policies, the make-to-stock and the assemble-to-order systems, in terms of total cost, which is the sum of inventory carrying cost and cost incurred due to delayed deliveries. We formulate and solve the decoupling point location problem in supply chains as a total relevant cost (sum of inventory carrying cost and the delay costs) minimisation problem. We use the framework of integrated GSPN-queuing network modelling, with the GSPN at the higher level and a generalised queuing network at the lower level, to solve the decoupling point location problem.

Journal ArticleDOI
TL;DR: A new decision support software is presented—VIP analysis—which incorporates approaches belonging to different classes and proposes a methodology of analysis based on the progressive reduction of the number of alternatives, introducing a concept of tolerance that lets the decision makers use some of the approaches in a more flexible manner.
Abstract: We consider the aggregation of multicriteria performances by means of an additive value function under imprecise information. The problem addressed here is the way an analysis may be conducted when the decision makers are not able to (or do not wish to) fix precise values for the importance parameters. These parameters can be seen as interdependent variables that may take several values subject to constraints. Firstly, we briefly classify some existing approaches to deal with this problem. We argue that they complement each other, each one having its merits and shortcomings. Then, we present a new decision support software—VIP analysis—which incorporates approaches belonging to different classes. It proposes a methodology of analysis based on the progressive reduction of the number of alternatives, introducing a concept of tolerance that lets the decision makers use some of the approaches in a more flexible manner.
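The kind of imprecise-information analysis described above can be sketched by bounding an alternative's additive value over a polytope of admissible weights, here just the unit simplex (the performance vector is invented):

```python
import numpy as np
from scipy.optimize import linprog

def value_bounds(v, A_ub=None, b_ub=None):
    """Best and worst additive value sum(w[i] * v[i]) of one alternative
    when the importance weights w are only known to lie in a polytope:
    w >= 0, sum(w) = 1, plus optional extra constraints A_ub w <= b_ub."""
    n = len(v)
    A_eq, b_eq = np.ones((1, n)), [1.0]
    bounds = [(0, 1)] * n
    lo = linprog(np.array(v), A_ub=A_ub, b_ub=b_ub,
                 A_eq=A_eq, b_eq=b_eq, bounds=bounds).fun
    hi = -linprog(-np.array(v), A_ub=A_ub, b_ub=b_ub,
                  A_eq=A_eq, b_eq=b_eq, bounds=bounds).fun
    return lo, hi

# Invented single-criterion values of one alternative on two criteria:
print(value_bounds([0.2, 0.8]))
```

Adding linear constraints such as "criterion 1 is at least as important as criterion 2" narrows the interval, which is the progressive reduction VIP analysis supports.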

Journal ArticleDOI
TL;DR: This paper deals with a multi-item newsvendor problem subject to a budget constraint on the total value of the replenishment quantities and develops simple and efficient heuristic algorithms based on a set of test problems.
Abstract: This paper deals with a multi-item newsvendor problem subject to a budget constraint on the total value of the replenishment quantities. Fixed costs for non-zero replenishments have been explicitly considered. Dynamic programming procedures are presented for two situations: (i) where the end item demand distributions are assumed known (illustrated for the case of normally distributed demand) and (ii) a distribution free approach where only the first two moments of the distributions are assumed known. In addition, simple and efficient heuristic algorithms have been developed. Computational experiments on a set of test problems show that the performance of the heuristics is excellent.
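A simple greedy heuristic in the spirit of the ones evaluated (not the paper's own algorithms) allocates one unit at a time to the item with the largest expected marginal profit until profitability or the budget runs out; demand is taken as Poisson and the cost data are invented:

```python
import heapq, math

def poisson_sf(k, lam):
    # P(D >= k) for Poisson-distributed demand
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))

def greedy_allocation(items, budget):
    """items: list of (unit_cost, unit_price, mean_demand).
    Greedy marginal-profit allocation under a budget on the total
    replenishment value -- a heuristic sketch, not the paper's DP."""
    q, spent = [0] * len(items), 0.0
    heap = [(-(p * poisson_sf(1, lam) - c), i)
            for i, (c, p, lam) in enumerate(items)]
    heapq.heapify(heap)
    while heap:
        neg_gain, i = heapq.heappop(heap)
        c, p, lam = items[i]
        if -neg_gain <= 0 or spent + c > budget:
            continue                     # unprofitable or unaffordable
        spent += c
        q[i] += 1
        # expected marginal profit of the next unit of item i
        heapq.heappush(heap, (-(p * poisson_sf(q[i] + 1, lam) - c), i))
    return q, spent

print(greedy_allocation([(1.0, 4.0, 5.0), (2.0, 5.0, 3.0)], budget=20.0))
```

The marginal expected profit of the (q+1)-th unit, price × P(D ≥ q+1) − cost, is decreasing in q, so the greedy loop terminates once every item's next unit is unprofitable or unaffordable.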

Journal ArticleDOI
TL;DR: It is argued that operational research is well fitted to handle strategic issues as the modelling approach of OR facilitates understanding and learning, and the evaluation of strategies prior to action.
Abstract: This paper explores the nature of operational research and its interactions with performance measurement and strategy. It is argued that operational research (OR) is well fitted to handle strategic issues as the modelling approach of OR facilitates understanding and learning, and the evaluation of strategies prior to action. The development of problem structuring methods is also a key aid to strategy and policy formulation. OR is also beginning to play a role in performance measurement and there is an opportunity for OR to lead in the improvement of performance measurement systems.

Journal ArticleDOI
TL;DR: In this article, the authors examined the performance of two different inventory models, namely a simple and an advanced model, for spare parts in a production plant of a confectionery producer in the Netherlands.
Abstract: This paper examines the performance of two different (s, Q) inventory models, namely a simple and an advanced model, for spare parts in a production plant of a confectionery producer in the Netherlands. The simple approach is more or less standard: the undershoot of the reorder level is not taken into account and the normal distribution is used as the distribution of demand during lead-time. The advanced model takes undershoots into account, differentiates between zero and nonzero demands during lead-time, and utilises the gamma distribution for the demand distribution. Both models are fed with parameters estimated by a procedure that forecasts demand sizes and time between demand occurrences separately (intermittent demand). The results show that the advanced approach yields a service level close to the desired one under many circumstances, while the simple approach is not consistent, in that it leads to much larger inventories in meeting the desired service level for all spare parts.

Journal ArticleDOI
TL;DR: A mathematical model is developed for ELSP taking into account the effect of imperfect quality and process restoration, and numerical examples are presented to illustrate important issues related to the developed model.
Abstract: In this paper, we model the effects of imperfect production processes on the economic lot scheduling problem (ELSP). It is assumed that the production facility starts in the in-control state producing items of high or perfect quality. However the facility may deteriorate with time and shifts at a random time to an out-of-control state, where it begins to produce nonconforming items. A mathematical model is developed for the ELSP taking into account the effect of imperfect quality and process restoration. Numerical examples are presented to illustrate important issues related to the developed model.

Journal ArticleDOI
TL;DR: A method is suggested for predicting the distribution of scores in international soccer matches, treating each team's goals scored as independent Poisson variables dependent on the Fédération Internationale de Football Association rating of each team, and the match venue.
Abstract: In this paper a method is suggested for predicting the distribution of scores in international soccer matches, treating each team's goals scored as independent Poisson variables dependent on the Fédération Internationale de Football Association (FIFA) rating of each team, and the match venue.
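Once two Poisson rates have been estimated, match-outcome probabilities follow by summing over the scoreline grid. A sketch with invented rates (the paper fits the rates to FIFA ratings and venue):

```python
import math

def outcome_probs(lam_home, lam_away, max_goals=12):
    """Home-win / draw / away-win probabilities, treating each team's
    goals as independent Poisson counts (rates invented, not fitted)."""
    pois = lambda k, lam: math.exp(-lam) * lam ** k / math.factorial(k)
    home = [pois(k, lam_home) for k in range(max_goals + 1)]
    away = [pois(k, lam_away) for k in range(max_goals + 1)]
    win = sum(home[h] * away[a] for h in range(max_goals + 1)
              for a in range(max_goals + 1) if h > a)
    loss = sum(home[h] * away[a] for h in range(max_goals + 1)
               for a in range(max_goals + 1) if h < a)
    draw = sum(home[k] * away[k] for k in range(max_goals + 1))
    return win, draw, loss

print(outcome_probs(1.6, 1.1))
```

Truncating at 12 goals loses negligible probability mass at realistic scoring rates, so the three outcome probabilities sum to one to high accuracy.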

Journal ArticleDOI
TL;DR: It was found that religious orientation, parental influence and level of exclusions all impacted on the ability of a school to deliver the best possible results in standard assessment tests.
Abstract: Data for this paper was collected from the OFSTED database on Hampshire primary schools. The schools in Southampton and Portsmouth were used in order to assess the factors that influence their productive efficiency. The data set included 19 variables on 176 schools and was analysed by means of Data Envelopment Analysis. Contextual variables, not included in the efficiency analysis, were used to explain the sources of inefficiency. It was found that religious orientation, parental influence and level of exclusions all impacted on the ability of a school to deliver the best possible results in standard assessment tests. This study is set within local and national priorities in education.
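The DEA efficiency score for a single unit is the optimum of a small linear programme. A sketch of the input-oriented CCR multiplier form with two invented units (the study itself uses 19 variables on 176 schools):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0 (multiplier form).
    X: inputs (n units x m), Y: outputs (n units x s)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[j0], np.zeros(m)])              # maximise u . y0
    A_ub = np.hstack([Y, -X])                              # u . y_j - v . x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None, :]   # v . x0 = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Two invented units, one input, one output:
X = np.array([[2.0], [4.0]])
Y = np.array([[2.0], [2.0]])
print(ccr_efficiency(X, Y, 0), ccr_efficiency(X, Y, 1))
```

Unit 0 produces the same output from half the input, so it is efficient (score 1) while unit 1 scores 0.5.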

Journal ArticleDOI
TL;DR: The modelling of condition monitoring information for three critical water pumps at a large soft-drinks manufacturing plant is described to predict the distribution of the residual lifetimes of the individual pumps.
Abstract: In this paper the modelling of condition monitoring information for three critical water pumps at a large soft-drinks manufacturing plant is described. The purpose of the model is to predict the distribution of the residual lifetimes of the individual pumps. This information is used to aid maintenance management decision-making, principally relating to overhaul. We describe a simple decision rule to determine whether maintenance action is necessary given monitoring information to date.

Journal ArticleDOI
TL;DR: A methodology that combines several multi-criteria methods to address electricity planning problems within a realistic context is proposed and an efficient social compromise between these conflicting objectives is obtained.
Abstract: Growing social concern about the environmental impact of economic development has drawn attention to the need to integrate environmental criteria into energy decision-making problems. This has made electricity planning issues more complex given the multiplicity of objectives and decision-makers involved in the decision making process. This paper proposes a methodology that combines several multi-criteria methods to address electricity planning problems within a realistic context. The method is applied to an electricity planning exercise in Spain with a planning horizon set for the year 2030. The model includes the following objectives: (1) total cost; (2) CO2; (3) SO2; and (4) NOx emissions, as well as the amount of radioactive waste produced. An efficient social compromise between these conflicting objectives is obtained, which shows the advantages of using this model for policy-making purposes.

Journal ArticleDOI
TL;DR: This model generates a reliability analysis for each train type, line and platform, and is used to explore some policy issues, and to show how punctuality and reliability are affected by changes in the distributions of exogenous delays.
Abstract: On busy congested rail networks, random delays of trains are prevalent, and these delays have knock-on effects which result in a substantial proportion of scheduled services being delayed or rescheduled. Here we develop and experiment with a simulation model to predict the probability distributions of these knock-on delays at stations, when faced with typical patterns of on-the-day exogenous delays. These methods can be used to test and compare the reliability of proposed schedules, or schedule changes, before adopting them. They can also be used to explore how schedule reliability may be affected by proposed changes in operating policies, for example, changes in minimum headways or dwell times, or changes in the infrastructure such as layout of lines, platforms or signals. This model generates a reliability analysis for each train type, line and platform. We can also use the model to explore some policy issues, and to show how punctuality and reliability are affected by changes in the distributions of exogenous delays.
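The knock-on mechanism can be illustrated for a single line: each train departs no earlier than its schedule plus its exogenous delay, and no closer than a minimum headway behind its predecessor. All parameters below are invented:

```python
import random

def knock_on_delays(scheduled, headway=3.0, exo_mean=2.0, seed=0):
    """Delays on a single line when each train suffers an exponential
    exogenous delay and trains must keep a minimum headway apart.
    A toy version of the knock-on mechanism; parameters invented."""
    rng = random.Random(seed)
    actual = []
    for t in scheduled:
        exo = rng.expovariate(1.0 / exo_mean)        # exogenous delay
        earliest = actual[-1] + headway if actual else t
        actual.append(max(t + exo, earliest))        # knock-on propagation
    return [a - t for a, t in zip(actual, scheduled)]

print(knock_on_delays([5.0 * i for i in range(10)]))
```

Repeating this over many random draws yields the empirical distribution of knock-on delays that the full model estimates per train type, line and platform.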

Journal ArticleDOI
TL;DR: A two-stage stochastic programming with recourse model for the problem of determining optimal planting plans for a vegetable crop is presented and solutions are obtained for a range of risk aversion factors that not only result in greater expected profit compared to the corresponding deterministic model, but also are more robust.
Abstract: A two-stage stochastic programming with recourse model for the problem of determining optimal planting plans for a vegetable crop is presented in this paper. Uncertainty in yields caused by factors such as weather is a major influence on many systems arising in horticulture. Traditional linear programming models are generally unsatisfactory in dealing with the uncertainty and produce solutions that are considered to involve an unacceptable level of risk. The first stage of the model relates to finding a planting plan which is common to all scenarios and the second stage is concerned with deriving a harvesting schedule for each scenario. Solutions are obtained for a range of risk aversion factors that not only result in greater expected profit compared to the corresponding deterministic model, but also are more robust.