
Showing papers in "Infor in 2011"


Journal ArticleDOI
29 Sep 2011-Infor
TL;DR: The proposed algorithm is able to solve instances with 234 ports, 16,278 demands over 9 time periods in 34 min, and the integer solutions found by rounding down are computed in less than 5 s and the gap is within 0.01% from the upper bound of the linear relaxation.
Abstract: This paper is concerned with the cargo allocation problem considering empty repositioning of containers for a liner shipping company. The aim is to maximize the profit of transported cargo in a network, subject to the cost and availability of empty containers. The formulation is a multi-commodity flow problem with additional inter-balancing constraints to control repositioning of empty containers. In a study of the cost efficiency of the global container-shipping network, Song et al. (2005) estimate that empty repositioning cost constitutes 27% of the total world fleet running cost. An arc-flow formulation is decomposed using the Dantzig–Wolfe principle to a path-flow formulation. A linear relaxation is solved with a delayed column generation algorithm. A feasible integer solution is found by rounding the fractional solution and adjusting flow balance constraints with leased containers. Computational results are reported for seven instances based on real-life shipping networks. Solving the relaxe...

84 citations


Journal ArticleDOI
29 Sep 2011-Infor
TL;DR: The basic MIP model is extended into a robust optimization model in order to account for the inherent uncertainty of the problem and can assist companies with varying degrees of risk tolerance in deciding the sale, purchase, chartering, lay-up, and scrapping of ships.
Abstract: Fleet sizing and deployment is a central decision problem of bulk shipping organizations. However, business and shipping market cycles induce significant asset price and transport demand volatility, making strategic fleet sizing extremely challenging. We propose a mixed integer programming (MIP) model of the multi-period fleet sizing and deployment problem. We extend the basic MIP model into a robust optimization model in order to account for the inherent uncertainty of the problem. Our approach can assist companies with varying degrees of risk tolerance in deciding the sale, purchase, chartering, lay-up, and scrapping of ships, as well as the deployment of the active ships to contracts and geographic markets. A realistic case study is presented.

51 citations


Journal ArticleDOI
29 Sep 2011-Infor
TL;DR: In this article, the authors present an exact solution method for an important planning problem faced by many shipping companies dealing with transportation of bulk cargoes, which is denoted as a maritime pickup and delivery problem with time windows and split loads (PDPTWSL).
Abstract: The purpose of this paper is to present an exact solution method for an important planning problem faced by many shipping companies dealing with transportation of bulk cargoes. A bulk shipping company usually has a set of contract cargoes that it is committed to carry and will try to derive additional revenue from optional spot cargoes. Each cargo, whether it is a contract or a spot cargo, consists of a given quantity to be picked up in a given loading port and delivered in a given unloading port within specified time windows. The shipping company controls a fixed fleet for the purpose of transporting the cargoes. In most of the literature on ship routing and scheduling problems, a cargo cannot be transported by more than one ship. By introducing split loads, this restriction is removed and each cargo can be transported by several ships. The resulting planning problem can be denoted as a maritime pickup and delivery problem with time windows and split loads (PDPTWSL). We present an arc flow formulat...

46 citations


Journal ArticleDOI
29 Sep 2011-Infor
TL;DR: A classification scheme for ship routing and scheduling problems in liner shipping in line with the current and future operational conditions of the liner shipping industry is provided.
Abstract: This article provides a classification scheme for ship routing and scheduling problems in liner shipping in line with the current and future operational conditions of the liner shipping industry. Based on the classification, the literature is divided into groups whose main characteristics are described. The literature within each group is reviewed, much of it for the first time.

41 citations


Journal ArticleDOI
29 Sep 2011-Infor
TL;DR: The relocation model is shown to perform better than the static location model by as much as 20–30% when using fire weather data to forecast short-term future demand for severe fires, whereas relocating without rolling horizon forecasting can be less cost-effective than a static location model.
Abstract: Location and relocation models are proposed for air tanker initial attack basing in California for regional wildland fires that require multiple air tankers, which may be co-located at the same air base. The Burning Index from the National Fire Danger Rating System is modeled as a discrete mean-reverting process and estimated from 2001–2006 data for select weather stations at each of the 12 California Department of Forestry units being studied. The standard p-median formulation is changed into a k-server p-median problem to assign multiple servers to a node. Furthermore, this static problem is extended into the time dimension to obtain a chance-constrained dynamic relocation problem. Both problems are solved using branch and bound in the numerical example. The relocation model is shown to perform better than the static location model by as much as 20–30% when using fire weather data to forecast short-term future demand for severe fires, whereas relocating without rolling horizon forecasting can be l...
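The k-server p-median idea, assigning multiple co-located servers (air tankers) to demand nodes (fires), can be illustrated with a brute-force sketch. This is a toy stand-in for the paper's branch-and-bound approach; the distances and demands below are invented:

```python
from itertools import combinations

def k_server_p_median(dist, demand, p):
    """Brute-force k-server p-median: open p bases so that the total
    distance-weighted server assignment cost is minimized.
    dist[j][i] = distance from candidate base j to demand node i;
    demand[i] = servers required at node i (servers may be co-located
    at one base, so each node draws all servers from its nearest base)."""
    n_bases = len(dist)
    best = (float("inf"), None)
    for bases in combinations(range(n_bases), p):
        cost = sum(r * min(dist[j][i] for j in bases)
                   for i, r in enumerate(demand))
        if cost < best[0]:
            best = (cost, bases)
    return best

# toy instance: 3 candidate air bases, 4 fire locations
dist = [[1, 4, 5, 9],
        [6, 1, 2, 7],
        [8, 7, 6, 1]]
demand = [2, 1, 3, 2]   # tankers required per fire
cost, bases = k_server_p_median(dist, demand, 2)
```

Enumeration is only viable for tiny instances; the paper's chance-constrained dynamic version needs proper integer programming machinery.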

36 citations


Journal ArticleDOI
01 Nov 2011-Infor
TL;DR: This paper proposes a method, based on Ward's hierarchical cluster method, to obtain surgery types that minimize the weighted sum of the dummy surgery volume and the variability in resource demand of surgery types, which can be used in master surgical scheduling.
Abstract: Master surgical scheduling can improve the manageability and efficiency of operating room departments. This approach cyclically executes a master surgical schedule of surgery types. Surgery types need to be constructed with low variability to be efficient. Each surgery type is scheduled based upon its frequency per cycle. Surgery types that cannot be scheduled repetitively are put together in so-called dummy surgeries. Narrowly defined surgery types, with low variability, lead to many of such dummy surgeries, which reduces the benefits of a master surgical scheduling approach. In this paper we propose a method, based on Ward's hierarchical cluster method, to obtain surgery types that minimize the weighted sum of the dummy surgery volume and the variability in resource demand of surgery types. The resulting surgery types (clusters) are thus based on logical features and can be used in master surgical scheduling. The approach is successfully tested on a case study in a regional hospital.
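The clustering idea can be sketched for one-dimensional data, where the Ward merge cost has a simple closed form. This is an illustrative agglomerative implementation, not the paper's weighted method, and the surgery durations are invented:

```python
def ward_cluster(values, k):
    """Agglomerative Ward clustering of 1-D surgery durations into k
    clusters. Merging clusters A and B increases the within-cluster
    sum of squares by |A||B|/(|A|+|B|) * (mean(A)-mean(B))**2, so we
    repeatedly merge the pair with the smallest increase."""
    clusters = [[v] for v in values]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                a, b = clusters[i], clusters[j]
                ma, mb = sum(a) / len(a), sum(b) / len(b)
                d = len(a) * len(b) / (len(a) + len(b)) * (ma - mb) ** 2
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return [sorted(c) for c in clusters]

# surgery durations in minutes; two natural groups emerge
durations = [30, 32, 35, 90, 95, 100]
groups = ward_cluster(durations, 2)
print(groups)
```

In the paper the objective also weighs dummy surgery volume; here only the variance term of the trade-off is shown.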

23 citations


Journal ArticleDOI
29 Sep 2011-Infor
TL;DR: This paper illustrates how operations research can be applied to large-scale maritime crude oil transportation, formulated as the crude oil tanker routing and scheduling problem (COTRASP), a prevalent problem in the petroleum and shipping industry.
Abstract: This paper illustrates how operations research can be applied to large-scale maritime crude oil transportation. The more the industry strives to improve the efficiency of their supply chains the mo...

19 citations


Journal ArticleDOI
29 Sep 2011-Infor
TL;DR: The state dependent models included within this paper extend the range of possible system applications with the algorithm to include transportation and material handling conveyor systems, and pedestrian and vehicular routing applications.
Abstract: Optimal routing in closed queueing networks with state dependent queues is the focus of this paper. We seek a methodology and algorithm to efficiently solve the routing probabilities in closed systems. These systems may include multi-server exponential and general service finite capacity state dependent models as well as multi-chain systems. The state dependent models included within this paper extend the range of possible system applications with our algorithm to include transportation and material handling conveyor systems, pedestrian, and vehicular routing applications. Sometimes the networks will be purely state dependent M/G/c/c queues, while other times, there will be a mixture of M/M/c and M/G/c/c queues. These state dependent M/G/c/c queues are finite queues, while the workstations are infinite buffer queues. Because these networks have been shown to have a product form and the objective function in single chain networks is concave, we can achieve an optimal seeking algorithm. Numerous ex...

16 citations


Journal ArticleDOI
29 Sep 2011-Infor
TL;DR: A path flow model and solution method for a maritime pulp distribution problem that incorporates the distribution planning from the ports to the customers as well as direct deliveries from the pulp mills is presented.
Abstract: In this paper, a path flow model and solution method for a maritime pulp distribution problem is presented and analyzed. Unlike many other models for maritime distribution chains, the customers are not located at the ports, but different modes of transportation are needed in order to deliver the pulp. This means that the model proposed also incorporates the distribution planning from the ports to the customers as well as direct deliveries from the pulp mills. For the distribution, a fleet of long-term time-chartered ships is used, but additional ships can also be chartered on the spot market. The problem is modeled as a mixed integer linear program and solved using a branch-and-price method. Due to the complexity of the problem, the solution strategy is divided into two phases, where the first emphasizes the generation of schedules for the time-chartered ships while the second deals with decisions regarding the chartering of ships on the spot market. To generate the schedules, a network based on clusters is constructed, and a modified k-shortest path algorithm is developed to solve the problem. The algorithm penalizes parts of schedules already generated, making the generated schedules more diversified than general k-shortest path algorithms.
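The penalization idea behind the modified k-shortest path algorithm can be sketched as repeated Dijkstra runs with growing edge penalties. The penalty step and the toy network below are assumptions for illustration, not values from the paper:

```python
import heapq

def dijkstra(adj, src, dst, penalty):
    """Shortest path on a weighted digraph; penalized edges cost extra."""
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, []):
            nd = d + w + penalty.get((u, v), 0.0)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return None
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path))

def diversified_paths(adj, src, dst, k, step=10.0):
    """Generate k diverse src->dst paths by re-running Dijkstra and
    penalizing edges used by earlier paths ('step' is an assumed
    penalty; the paper penalizes parts of generated schedules)."""
    penalty, paths = {}, []
    for _ in range(k):
        p = dijkstra(adj, src, dst, penalty)
        if p is None:
            break
        paths.append(p)
        for u, v in zip(p, p[1:]):
            penalty[(u, v)] = penalty.get((u, v), 0.0) + step
    return paths

adj = {"mill": [("portA", 1), ("portB", 2)],
       "portA": [("cust", 1)],
       "portB": [("cust", 1)]}
paths = diversified_paths(adj, "mill", "cust", 2)
```

Unlike Yen's algorithm, which enumerates paths strictly by cost, this penalty scheme trades optimality for diversity, matching the motivation stated in the abstract.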

15 citations


Journal ArticleDOI
29 Sep 2011-Infor
TL;DR: Algorithms are presented for the search resource allocation problem in aeronautical SAR incidents with multiple indivisible searchers, based on classical search theory and on constraint programming.
Abstract: Search and Rescue (SAR) comprises the search for and provision of aid to persons who are, or who are feared to be, in distress or in imminent danger of loss of life. Time is a crucial factor for survivors who must be found quickly and search planning may get complex in the case of a large search area and multiple search resources. The problem we address in this paper is that of defining and assigning multiple non-overlapping rectangular sub-areas to search units (search aircraft) such that the search plan is operationally feasible and the total probability of success is maximized. We present algorithms we developed for the search resources allocation problem for aeronautical SAR incidents when multiple indivisible searchers are present. These algorithms are based on classical search theory and on constraint programming. We assume that the search effort is continuous and measured by track length, that the search object is stationary and that search is conducted in discrete space. We present experi...

13 citations


Journal ArticleDOI
01 Aug 2011-Infor
TL;DR: This paper characterizes the structures of the optimal designs and compares them with their deterministic counterparts to understand what constitutes good robust network designs, insight that can later be used to develop better heuristics than those available today.
Abstract: This paper examines the single-commodity network design problem with stochastic demand and multiple sources and sinks. We characterize the structures of the optimal designs and compare with the deterministic counterparts. We do this primarily to understand what constitutes good robust network designs. This can later be used to develop better heuristics than those available today.

Journal ArticleDOI
01 Aug 2011-Infor
TL;DR: This paper details the mathematical programming model that formed the basis of a software tool developed to assist security planners in personnel scheduling, providing a novel formulation for applying integer programming to scheduling problems in the context of an important practical application.
Abstract: The Vancouver 2010 Integrated Security Unit (V2010-ISU) ensured security during the Vancouver 2010 Olympic Games. Over six thousand Royal Canadian Mounted Police (RCMP) officers provided round-the-clock security for 30 venues and 27 functions. The V2010-ISU needed to develop shift schedules for the RCMP officers so that not only were hourly security requirements met, but work shifts needed to satisfy a variety of scheduling constraints (shift lengths, start times, rest periods, etc.). As the number of personnel that were required for each hour at each venue was anticipated to change, V2010-ISU planners required an automated means of generating efficient schedules quickly. This paper details the mathematical programming model which formed the basis of a software tool that was developed to assist security planners in personnel scheduling. It provides a novel mathematical formulation for the technique of applying integer programming to scheduling problems, in the context of an important practical ap...
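The core of such a shift-scheduling integer program can be illustrated on a toy instance small enough to solve by enumeration. The shift patterns and hourly demands below are invented; the actual V2010-ISU model is far larger and solved with an IP solver:

```python
from itertools import product

def min_staff(shifts, demand, max_per_shift=5):
    """Tiny shift-scheduling integer program solved by enumeration:
    choose how many officers work each shift pattern so that every
    hour's requirement is met with the fewest officers.
    shifts[s] is the set of hours shift pattern s covers;
    demand[h] is the number of officers needed in hour h."""
    hours = range(len(demand))
    best = (float("inf"), None)
    for counts in product(range(max_per_shift + 1), repeat=len(shifts)):
        total = sum(counts)
        if total >= best[0]:
            continue   # cannot improve on the incumbent
        if all(sum(c for c, s in zip(counts, shifts) if h in s) >= demand[h]
               for h in hours):
            best = (total, counts)
    return best

# 6-hour horizon, three overlapping shift patterns (hypothetical data)
shifts = [{0, 1, 2, 3}, {2, 3, 4, 5}, {1, 2, 3, 4}]
demand = [1, 2, 3, 3, 2, 1]
result = min_staff(shifts, demand)
```

Real instances add shift-length, start-time, and rest-period constraints and are handed to a MIP solver; the covering structure, however, is exactly the one shown.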

Journal ArticleDOI
29 Sep 2011-Infor
TL;DR: The world of maritime transportation is full of important, interesting and challenging operations research (OR) problems including ship routing and scheduling, maritime inventory routing, handling of empty containers, fleet renewal, port or terminal management and cargo stowage onboard ships to mention a few.
Abstract: (2011). Some Thoughts on Research Directions for the Future: Introduction to the Special Issue in Maritime Transportation. INFOR: Information Systems and Operational Research: Vol. 49, Special Issue in “Maritime Transportation”, pp. 75-77.

Journal ArticleDOI
01 Nov 2011-Infor
TL;DR: An optimization model in which the manufacturer maximizes its expected profit by choosing marketing efforts to promote different uses is developed, which drives a number of interesting managerial insights on how to control off-label uses by applying operations research methods to address a health care policy issue.
Abstract: Third-party payers often reimburse drugs that are listed on formularies. Formularies list drugs that have been clinically proved to be safe and effective and have been approved for certain uses by a regulatory authority, such as the U.S. Food and Drug Administration. However, once a drug is approved, physicians may also prescribe it for unapproved or “off-label” indications. In addition, although third-party payers may specify some of the labelled uses for reimbursement, prescriptions may leak to unspecified but labelled indications. Once a drug is listed on a formulary, the payer faces unlimited liability for that drug. Drug manufacturers thus try to get their drugs listed on a formulary and promote sales for both labelled and off-label uses. Some third-party payers use price-volume agreements to control unspecified drug uses. This paper investigates how a manufacturer would make marketing decisions under a price-volume agreement. We develop an optimization model in which the manufacturer maximizes ...

Journal ArticleDOI
01 Aug 2011-Infor
TL;DR: An optimal algorithm is developed for a Weber location problem in which the weights are drawn from a multivariate distribution.
Abstract: A new objective for the Weber location problem is proposed. The weights of the Weber problem are drawn from a multivariate distribution. The objective is to minimize the probability of over-running a cost threshold. Alternatively, we may wish to minimize the threshold for a given probability. These concepts can be applied to many optimization models as well. We analyze the problem and develop an optimal algorithm to solve it. An illustrative example is solved and computational results for randomly generated problems are presented.
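For context, the classical deterministic Weber problem that this threshold objective builds on is typically solved with the Weiszfeld iteration, sketched below. This is the standard textbook algorithm, not the paper's algorithm for the probabilistic objective:

```python
import math

def weiszfeld(points, weights, iters=200, eps=1e-12):
    """Weiszfeld iteration for the classical Weber problem: find the
    point minimizing the weighted sum of Euclidean distances to the
    demand points. Starts from the weighted centroid."""
    x = sum(w * p[0] for p, w in zip(points, weights)) / sum(weights)
    y = sum(w * p[1] for p, w in zip(points, weights)) / sum(weights)
    for _ in range(iters):
        num_x = num_y = den = 0.0
        for (px, py), w in zip(points, weights):
            d = math.hypot(x - px, y - py)
            if d < eps:          # iterate landed on a demand point
                return px, py
            num_x += w * px / d
            num_y += w * py / d
            den += w / d
        x, y = num_x / den, num_y / den
    return x, y

# four symmetric demand points with equal weights -> optimum at origin
pts = [(1, 0), (-1, 0), (0, 1), (0, -1)]
x, y = weiszfeld(pts, [1, 1, 1, 1])
```

Under the paper's model the weights are random, so the single fixed-point iteration above no longer suffices; it remains the natural subroutine for evaluating candidate locations.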

Journal ArticleDOI
01 Nov 2011-Infor
TL;DR: The impact of using incentives to encourage optimal allocation of HIV/AIDS prevention funds, which maximizes the number of HIV infections averted, is evaluated in a two-level decision-making process.
Abstract: HIV/AIDS prevention funds are often allocated at multiple levels of decision-making. Optimal allocation of HIV prevention funds maximizes the number of HIV infections averted. However, decision makers often allocate using simple heuristics such as proportional allocation. We evaluate the impact of using incentives to encourage optimal allocation in a two-level decision-making process. We model an incentive based decision-making process consisting of an upper-level decision maker allocating funds to a single lower-level decision maker who then distributes funds to local programs. We assume that the lower-level utility function is linear in the amount of the budget received from the upper-level, the fraction of funds reserved for proportional allocation, and the number of infections averted. We assume that the upper level objective is to maximize the number of infections averted. We illustrate with an example using data from California, U.S.
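The gap between proportional and optimal allocation can be illustrated with a greedy sketch for separable concave averted-infection curves. The curves and numbers below are invented, not the paper's California data:

```python
import math

def greedy_allocate(programs, budget, unit=1.0):
    """Greedy marginal allocation of a prevention budget across programs
    with concave averted-infection curves
    f_i(b) = scale_i * (1 - exp(-rate_i * b))   (illustrative form).
    Allocating one unit at a time to the largest marginal gain is
    optimal for separable concave objectives."""
    alloc = [0.0] * len(programs)

    def averted(i, b):
        scale, rate = programs[i]
        return scale * (1 - math.exp(-rate * b))

    for _ in range(int(round(budget / unit))):
        gains = [averted(i, alloc[i] + unit) - averted(i, alloc[i])
                 for i in range(len(programs))]
        i = max(range(len(programs)), key=gains.__getitem__)
        alloc[i] += unit
    return alloc, sum(averted(i, alloc[i]) for i in range(len(programs)))

# two programs: (max infections avertable, responsiveness per budget unit)
programs = [(100, 0.5), (40, 0.9)]
alloc, total = greedy_allocate(programs, budget=6.0)
```

Here the greedy optimum averts about 119.9 infections versus about 115.0 for a 3/3 proportional split, the kind of gap the paper's incentive scheme is designed to close.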

Journal ArticleDOI
01 Aug 2011-Infor
TL;DR: It is demonstrated how the fuzzy inference process produces an efficiency profile of a producing unit, which can serve as the output of the analysis or which can be “defuzzified” to produce a specific efficiency score.
Abstract: We apply fuzzy logic to measure the efficiency of a unit that consumes multiple inputs to produce multiple outputs. We demonstrate how the fuzzy inference process produces an efficiency profile of a producing unit, which can serve as the output of the analysis or which can be “defuzzified” to produce a specific efficiency score. The approach, which we call FuzzEA for Fuzzy Efficiency Analysis, allows the analyst to measure the efficiency of an individual unit without collecting detailed data for all comparable units. The approach allows the analyst, in conjunction with the context expert, to incorporate industry-specific experiential knowledge concerning relevant ratios of inputs and outputs, information often known as benchmarks, or “rules of thumb.” Therefore, the analyst may prefer the FuzzEA approach when detailed data on all producing units are either unavailable or cumbersome to collect, or when the analyst requires efficiency results for only one or a small number of units. Finally, contex...

Journal ArticleDOI
01 Nov 2011-Infor
TL;DR: The objective of the Toronto meeting was to bring together experts in operational research and its application in health care to highlight common issues and develop new methodological approaches to questions of health care policy, planning, and development.
Abstract: ORAHS is the EURO Working Group on Operational Research Applied to Health Services. It was formed in 1975 as a special interest group in EURO (the Association of European Operational Research Societies), providing a network for researchers involved in the application of systematic and quantitative analysis to support planning and management in the health services sector. The group has more than 178 members from 23 countries around the world. Traditionally the group meets every summer, for one week, in a different European host country. The 34th annual meeting of ORAHS was held in Toronto, Ontario from July 24-August 1, 2008 at the University of Toronto. Features of the Toronto meeting included a trip to Niagara Falls, a doctoral student colloquium, as well as a detailed program of scientific sessions. More than 160 persons from 16 countries attended the conference, including more than 30 Canadians. A total of 110 talks were presented, including the CORS 50th Anniversary Lecture, which was presented by Dr. Bill Pierskalla from UCLA on July 28, 2008. Keynote talks were presented by Dr. Jeff Griffiths from the University of Cardiff, and Dr. Michael Rachlis of the University of Toronto. The Toronto meeting marked the first time an ORAHS meeting has been held in North America and only the second time that a meeting has been held outside of Europe. The theme of the Toronto meeting was “International Perspectives on Operations Research and Health Care.” Health care is an expensive business – most industrial economies spend more than 10% of their gross domestic product on the provision of health care. The efficient and effective organization and deployment of health care resources is an important issue for governments and patients alike. Operational research offers a unique perspective and set of tools for managing and optimizing this important and expensive societal resource. 
The objective of the Toronto meeting was to bring together experts in operational research and its application in health care to highlight common issues and develop new methodological approaches to questions of health care policy, planning, and development. In December 2009 the conference proceedings were published. A total of 13 papers, covering applications ranging from transportation of supplies in a multi-site hospital corporation to home care planning and forecasting, and methods ranging from integer programming to lean methodologies were accepted for publication. We have selected four of the papers from the proceedings for this special issue of INFOR. The papers selected, we feel, highlight both the broad scope of OR models in health care, as well as the depth of analysis and rigour of methodological development, which makes health care a fascinating venue for research. We would like to thank everyone who attended ORAHS 34 for sharing their work and their ideas. We would like, in particular, to thank those authors who contributed towards the creation of a wonderful conference proceedings and congratulate the authors of the four papers appearing in this special issue. Finally, special thanks are due to the members of the organizing committee that made ORAHS 2008 happen: Sonia Vanderby, Dionne Aleman, Daphne Sniekers, Ali Esensoy and Andriy Kolos. In addition, we would also like to thank the members of the international program committee: Sally Brailsford (United Kingdom), Murray Côté (United States), Erwin Hans (Netherlands), Paul Harper (United Kingdom), Stefan Nickel (Germany), and Marion Rauner (Austria) for their help and support.

Journal ArticleDOI
01 Nov 2011-Infor
TL;DR: This work presents and discusses in detail the difficulties posed by TMI treatment planning in conjunction with IMRT, as well as those posed by non-coplanar beam orientation optimization (BOO).
Abstract: As part of the conditioning process to prepare the patient for the bone marrow transplant, the patient is treated with total marrow irradiation (TMI). The purpose of TMI is to eliminate the underlying disease and to suppress the recipient's immune systems, thus preventing rejection of new donor stem cells. Designing a treatment plan for TMI poses unique challenges that are not present in other forms of site-specific radiation therapy, for example, head-and-neck and prostate treatments. Specifically, the large site to be treated results in clinical treatments where the patient must be positioned far from the isocenter, as well as often repositioned during treatment, thus increasing uncertainty in delivered dose. Designing TMI treatments with intensity modulated radiation therapy (IMRT) will provide more accurate treatments that can spare healthy tissues while simultaneously delivering the prescribed radiation dose to the bones. To bring the patient closer to isocenter, beam orientation optimizatio...

Journal ArticleDOI
29 Sep 2011-Infor
TL;DR: This paper addresses two current era problems of information technology—spam and virus recognition and is based on the same mathematical model, the so-called dichotomous choice model.
Abstract: Borrowing from the vote aggregation literature, this paper addresses two current era problems of information technology – spam and virus recognition. The underlying idea behind the solutions proposed is to combine the opinions of several “experts” in one of these areas (i.e., anti-spam filters, anti-virus programs, respectively) to a final group decision. Despite the fact that the problems, and thus the applications designed to tackle them, are different, our approach in these solutions is unified, and is based on the same mathematical model, the so-called dichotomous choice model.
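The dichotomous choice model's standard aggregation rule, weighted majority with log-odds weights, can be sketched as follows. The filter competencies are illustrative, and independence of the "experts" is assumed:

```python
import math

def combine_verdicts(verdicts, competencies):
    """Dichotomous choice aggregation: combine binary expert verdicts
    (+1 = spam, -1 = ham) using the log-odds weights
    w_i = log(p_i / (1 - p_i)), which are optimal for independent
    experts with known competence p_i (probability of being correct)."""
    score = sum(v * math.log(p / (1 - p))
                for v, p in zip(verdicts, competencies))
    return 1 if score > 0 else -1

# two weak filters say spam, one strong filter says ham:
# the strong filter's log-odds weight outvotes the pair
verdicts = [1, 1, -1]
competencies = [0.6, 0.6, 0.9]
decision = combine_verdicts(verdicts, competencies)
```

With equal competencies the rule reduces to simple majority; it is the heterogeneous-competence case that makes the weighting matter for spam and virus filters.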

Journal ArticleDOI
29 Sep 2011-Infor
TL;DR: This paper aims to show that a variant of data envelopment analysis (DEA) called the “congestion model” can offer a more precise picture when identifying CTAs suffering from congestion, and shows that the probability of experiencing congestion increases with a CTA's size, minimum purchase requirements, and incentive fees.
Abstract: Congestion is often used in the operations area to investigate the excessive effect of inputs on outputs. In finance, and more specifically in the derivatives area, leverage is embedded in options and futures contracts. Commodity Trading Advisors (CTAs) use leverage (margin-to-equity ratio) to magnify returns through the use of these futures contracts. However, excessive leverage may hamper performance. This paper aims to show that a related data envelopment analysis (DEA) called the “congestion model” can offer a more precise picture of identifying CTAs suffering from congestion. In other words, if congestion is present then a reduction in input(s) may generate an increase in output. However, the opposite effect can arise. Although traditional DEA does an excellent job at ranking efficient CTAs, congestion on the other hand sizes up which CTAs are using too much (overuse) of each input, thereby reducing their performance/compound return (output). We measure the congestion of the largest (in term...

Journal ArticleDOI
01 Aug 2011-Infor
TL;DR: The idea is to obtain recursive expressions based on the properties of continued fractions and the transient solutions of the systems, which leads to computational algorithms for the probability of the number of retrials made by a blocked primary customer.
Abstract: Customer retrials are very common phenomena in industrial engineering and business management. The number of retrials required before receiving a success is an important measure for evaluating system performance. Focusing on the M/M/1 and M/M/2 retrial queues, we study the conditional probability of a successful retrial given that all previous retrials have been denied. The idea is to obtain the recursive expressions based on the properties of continued fractions and the transient solutions of systems. The result leads to computational algorithms for the probability of the number of retrials made by a blocked primary customer. Through implementation of the proposed algorithms, numerical examples and analysis are presented to show computational efficiency and properties of the conditional probability.