
Showing papers in "Annals of Operations Research in 1996"


Journal ArticleDOI
TL;DR: This bibliography provides a classification of a comprehensive list of 1380 references on the theory and application of metaheuristics that have had widespread successes in attacking a variety of difficult combinatorial optimization problems that arise in many practical areas.
Abstract: Metaheuristics are the most exciting development in approximate optimization techniques of the last two decades. They have had widespread successes in attacking a variety of difficult combinatorial optimization problems that arise in many practical areas. This bibliography provides a classification of a comprehensive list of 1380 references on the theory and application of metaheuristics. Metaheuristics include but are not limited to constraint logic programming; greedy random adaptive search procedures; natural evolutionary computation; neural networks; non-monotonic search strategies; space-search methods; simulated annealing; tabu search; threshold algorithms and their hybrids. References are presented in alphabetical order under a number of subheadings.

646 citations


Journal ArticleDOI
TL;DR: A simple genetic algorithm is introduced, and various extensions are presented to solve the traveling salesman problem, using randomized search techniques that simulate some of the processes observed in natural evolution.
Abstract: This paper is a survey of genetic algorithms for the traveling salesman problem. Genetic algorithms are randomized search techniques that simulate some of the processes observed in natural evolution. In this paper, a simple genetic algorithm is introduced, and various extensions are presented to solve the traveling salesman problem. Computational results are also reported for both random and classical problems taken from the operations research literature.

571 citations
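
As a concrete illustration of the kind of simple genetic algorithm the survey describes, the sketch below evolves TSP tours with order crossover and swap mutation. It is a minimal Python illustration, not the paper's implementation; the random city coordinates, population size and operator choices are assumptions made for the example.

```python
# Minimal genetic algorithm for the TSP (illustrative sketch, not the survey's exact operators).
# Assumes random city coordinates; uses order crossover (OX) and swap mutation.
import random, math

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]]) for i in range(len(tour)))

def order_crossover(p1, p2):
    """Copy a slice of parent 1, fill the rest in parent 2's order."""
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    fill = [c for c in p2 if c not in child]
    for i in range(n):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def ga_tsp(pts, pop_size=60, generations=300, mut_rate=0.2):
    n = len(pts)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: tour_length(t, pts))
        elite = pop[: pop_size // 2]                      # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = random.sample(elite, 2)
            child = order_crossover(p1, p2)
            if random.random() < mut_rate:                # swap mutation
                i, j = random.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = elite + children
    best = min(pop, key=lambda t: tour_length(t, pts))
    return best, tour_length(best, pts)

if __name__ == "__main__":
    random.seed(1)
    cities = [(random.random(), random.random()) for _ in range(30)]
    tour, length = ga_tsp(cities)
    print(round(length, 3))
```

The extensions surveyed in the paper (problem-specific crossovers and local-improvement hybrids) would replace these deliberately naive operators.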


Journal ArticleDOI
TL;DR: This paper develops simulated annealing metaheuristics for the vehicle routing and scheduling problem with time window constraints using the λ-interchange mechanism of Osman and the k-node interchange process of Christofides and Beasley.
Abstract: This paper develops simulated annealing metaheuristics for the vehicle routing and scheduling problem with time window constraints. Two different neighborhood structures, the λ-interchange mechanism of Osman and the k-node interchange process of Christofides and Beasley, are implemented. The enhancement of the annealing process with a short-term memory function via a tabu list is examined as a basis for improving the metaheuristic approach. Computational results on test problems from the literature as well as a large-scale real-world problem are reported. The metaheuristics achieve solutions that compare favorably with previously reported results.

344 citations
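
The annealing skeleton below sketches the overall scheme: a current solution is perturbed by a single-customer relocate move, the simplest (1,0) case of a λ-interchange, and worse solutions are accepted with a temperature-dependent probability. It is an illustrative Python sketch only; time-window feasibility checks and the paper's tabu-list enhancement are omitted, and all instance data in the demo are made up.

```python
# Simulated annealing skeleton for a capacitated routing solution (illustrative sketch).
# Neighborhood: relocate one customer between routes, the simplest (1,0) lambda-interchange.
import random, math, copy

def route_cost(route, depot, pts):
    seq = [depot] + route + [depot]
    return sum(math.dist(pts[seq[i]], pts[seq[i + 1]]) for i in range(len(seq) - 1))

def total_cost(routes, depot, pts):
    return sum(route_cost(r, depot, pts) for r in routes)

def relocate(routes, demand, capacity):
    """Move a random customer to a random position in another route, if capacity allows."""
    new = copy.deepcopy(routes)
    src, dst = random.sample(range(len(new)), 2)
    if not new[src]:
        return None
    cust = new[src].pop(random.randrange(len(new[src])))
    if sum(demand[c] for c in new[dst]) + demand[cust] > capacity:
        return None
    new[dst].insert(random.randrange(len(new[dst]) + 1), cust)
    return new

def anneal(routes, depot, pts, demand, capacity, t0=1.0, alpha=0.995, iters=20000):
    best = cur = routes
    best_c = cur_c = total_cost(routes, depot, pts)
    t = t0
    for _ in range(iters):
        cand = relocate(cur, demand, capacity)
        if cand is not None:
            c = total_cost(cand, depot, pts)
            # accept improvements always, deteriorations with Boltzmann probability
            if c < cur_c or random.random() < math.exp((cur_c - c) / t):
                cur, cur_c = cand, c
                if c < best_c:
                    best, best_c = cand, c
        t *= alpha
    return best, best_c

if __name__ == "__main__":
    random.seed(0)
    pts = {i: (random.random(), random.random()) for i in range(11)}  # node 0 is the depot
    demand = {i: 1 for i in range(1, 11)}
    routes = [[1, 2, 3, 4, 5], [6, 7, 8, 9, 10]]
    print(round(anneal(routes, 0, pts, demand, capacity=6)[1], 3))
```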


Journal ArticleDOI
TL;DR: It is illustrated that genetic operators can fulfill long-term strategic functions for a tabu search implementation that is chiefly founded on short-term memory strategies.
Abstract: Some genetic algorithms are considered for the graph coloring problem. As is the case for other combinatorial optimization problems, pure genetic algorithms are outperformed by neighborhood search heuristic procedures such as tabu search. Nevertheless, we examine the performance of several hybrid schemes that can obtain solutions of excellent quality. For some graphs, we illustrate that genetic operators can fulfill long-term strategic functions for a tabu search implementation that is chiefly founded on short-term memory strategies.

330 citations


Journal ArticleDOI
TL;DR: Seven theorems are presented which expand understanding of the theoretical structure of the Charnes-Cooper-Rhodes (CCR) model of Data Envelopment Analysis, especially with respect to slacks and the underlying structure of facets and faces, and are a basis for new algorithms which will provide optimal primal and dual solutions.
Abstract: This paper presents seven theorems which expand understanding of the theoretical structure of the Charnes-Cooper-Rhodes (CCR) model of Data Envelopment Analysis, especially with respect to slacks and the underlying structure of facets and faces. These theorems also serve as a basis for new algorithms which will provide optimal primal and dual solutions that satisfy the strong complementary slackness conditions (SCSC) for many (if not most) non-radially efficient DMUs; an improved procedure for identifying the set E of extreme efficient DMUs; and may, for many DEA domains, also settle in a single pass the existence or non-existence of input or output slacks in each of their DMUs. This paper also introduces the concept of a positive goal vector G, which is applied to characterize the set of all possible maximal optimal slack vectors. Appendix C presents an example which illustrates the need for a new concept, face regular, which focuses on the role of convexity in the intersections of radial efficient facets with the efficient frontier FR. The same example also illustrates flaws in the popular “sum of the slacks” methodology.

262 citations
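
For readers unfamiliar with the CCR model referred to here, the sketch below solves the standard input-oriented CCR envelopment linear program once per DMU with scipy; the efficiency score is the optimal θ. The small data set is invented for illustration, and the sketch does not implement the paper's new algorithms for SCSC-satisfying solutions or slack analysis.

```python
# Input-oriented CCR envelopment model solved once per DMU (illustrative sketch).
# The small input/output data set below is made up for demonstration.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 7.0, 8.0, 4.0, 2.0],     # inputs: rows = inputs, cols = DMUs
              [3.0, 3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0, 1.0]])    # outputs: rows = outputs, cols = DMUs

def ccr_efficiency(o):
    """Return theta* for DMU o: min theta s.t. X @ lam <= theta * X[:, o], Y @ lam >= Y[:, o]."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                          # variables: [theta, lambda_1..lambda_n]
    A_in = np.c_[-X[:, [o]], X]                          # X lam - theta * x_o <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]                  # -Y lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for o in range(X.shape[1]):
    print(f"DMU {o}: theta* = {ccr_efficiency(o):.3f}")
```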


Journal ArticleDOI
TL;DR: This special volume of the Annals of Operations Research contains revised versions of a selection of the papers that were presented at the 9th International Conference on the Practice and Theory of Automated Timetabling (PATAT) in Belfast between 10th and 13th August 2010.
Abstract: This special volume comprises revised versions of a selection of the papers that were presented at the 9th International Conference on the Practice and Theory of Automated Timetabling (PATAT) in Belfast between 10th and 13th August 2010. The PATAT conferences are held biennially and this is the second time that the Annals of Operations Research has provided the venue for such a special collection of papers. The first PATAT special volume of this journal (Volume 194) contains papers associated with the 8th conference which was held in Montreal in 2008. PATAT acts as an international forum for all aspects of timetabling, including educational timetabling, personnel rostering, sports timetabling, and transport scheduling. The conference series is particularly concerned both with closing the gap between timetabling theory and practice and with supporting multidisciplinary interactions. The collection of papers in this special volume reflects these aims. The conference in Belfast brought together approximately 100 participants from around the world. There were five plenary presentations, 74 standard talks, and 16 practitioner presentations. All the delegates were invited to submit their revised papers to this special volume. The papers have been through a rigorous and thorough review process, and we are delighted to be able to present the community with such an interesting and diverse selection of articles that reflect the latest thinking in timetabling research. We would like to take this opportunity to thank all those who were responsible for the success of the conference. We would particularly like to thank Brian Fleming and all those within the School of Electronics, Electrical Engineering and Computer Science at the Queen’s University of Belfast who worked so hard before and during the conference.

242 citations


Journal ArticleDOI
TL;DR: A revision and a generalization of the results contained in the only paper so far published on the matter of translation invariance is undertaken by allowing inputs and outputs to take not only zero but negative values, broadening the field of application of the DEA methodology.
Abstract: In this paper, we undertake a revision and a generalization of the results contained in the only paper so far published on the matter of translation invariance by allowing inputs and outputs to take not only zero but negative values. This broadens the field of application of the DEA methodology.

231 citations


Journal ArticleDOI
TL;DR: Previously used models, such as those used to identify “allocative inefficiencies”, are extended by means of “assurance region” approaches which are less demanding in their information requirements and underlying assumptions.
Abstract: The extensions, new developments and new interpretations for DEA covered in this paper include: (1) new measures of efficiency, (2) new models and (3) new ways of implementing established models with new results and interpretations presented that include treatments of “congestion”, “returns-to-scale” and “mix” and “technical” inefficiencies and measures of efficiency that can be used to reflect all pertinent properties. Previously used models, such as those used to identify “allocative inefficiencies”, are extended by means of “assurance region” approaches which are less demanding in their information requirements and underlying assumptions. New opportunities for research are identified in each section of this chapter. Sources of further developments and possible sources for further help are also suggested with references supplied to other papers that appear in this volume and which are summarily described in this introductory chapter.

230 citations


Journal ArticleDOI
TL;DR: DEA (Data Envelopment Analysis) models and concepts are formulated here in terms of the "P-Models" of Chance Constrained Programming, which are modified to make contact with the "satisficing concepts" of H.A. Simon, adding satisficing as a third category to the efficiency/inefficiency dichotomies that have heretofore prevailed in DEA.
Abstract: DEA (Data Envelopment Analysis) models and concepts are formulated here in terms of the “P-Models” of Chance Constrained Programming, which are then modified to make contact with the “satisficing concepts” of H.A. Simon. Satisficing is thereby added as a third category to the efficiency/inefficiency dichotomies that have heretofore prevailed in DEA. Formulations include cases in which inputs and outputs are stochastic, as well as cases in which only the outputs are stochastic. Attention is also devoted to situations in which variations in inputs and outputs are related through a common random variable. Extensions include new developments in goal programming with deterministic equivalents for the corresponding satisficing models under chance constraints.

217 citations


Journal ArticleDOI
TL;DR: This paper describes a general approach which promises good performance for a fairly extensive class of problems by hybridizing the GA with existing simple heuristics, and raises the possibility of blending GAs with orthodox mathematical programming procedures.
Abstract: The genetic algorithm (GA) paradigm has attracted considerable attention as a promising heuristic approach for solving optimization problems. Much of the development has related to problems of optimizing functions of continuous variables, but recently there have been several applications to problems of a combinatorial nature. What is often found is that GAs have fairly poor performance for combinatorial problems if implemented in a naive way, and most reported work has involved somewhat ad hoc adjustments to the basic method. In this paper, we will describe a general approach which promises good performance for a fairly extensive class of problems by hybridizing the GA with existing simple heuristics. The procedure will be illustrated mainly in relation to the problem of bin-packing, but it could be extended to other problems such as graph partitioning, parallel-machine scheduling and generalized assignment. The method is further extended by using problem size reduction hybrids. Some results of numerical experiments will be presented which attempt to identify those circumstances in which these heuristics will perform well relative to exact methods. Finally, we discuss some general issues involving hybridization: in particular, we raise the possibility of blending GAs with orthodox mathematical programming procedures.

131 citations
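
A minimal sketch of the hybridization idea: the genetic algorithm searches over item permutations, while a simple heuristic (first-fit) turns each permutation into a packing whose bin count is the fitness. This mirrors the general "GA plus existing simple heuristic" approach rather than the paper's exact operators; item sizes and GA parameters are assumptions.

```python
# Hybrid GA for bin packing (illustrative sketch): chromosomes are item permutations
# decoded by the first-fit heuristic; fitness is the number of bins used.
import random

def first_fit(order, sizes, capacity):
    """Decode a permutation of items into bins with the first-fit heuristic."""
    bins = []
    for item in order:
        for b in bins:
            if sum(sizes[i] for i in b) + sizes[item] <= capacity:
                b.append(item)
                break
        else:
            bins.append([item])
    return bins

def hybrid_ga(sizes, capacity, pop_size=40, generations=200):
    n = len(sizes)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    fitness = lambda perm: len(first_fit(perm, sizes, capacity))
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = random.sample(elite, 2)
            cut = random.randrange(1, n)
            child = p1[:cut] + [i for i in p2 if i not in p1[:cut]]   # order-preserving crossover
            i, j = random.sample(range(n), 2)
            child[i], child[j] = child[j], child[i]                    # swap mutation
            children.append(child)
        pop = elite + children
    best = min(pop, key=fitness)
    return first_fit(best, sizes, capacity)

if __name__ == "__main__":
    random.seed(2)
    sizes = [random.randint(20, 70) for _ in range(40)]
    print(len(hybrid_ga(sizes, capacity=100)), "bins")
```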


Journal ArticleDOI
TL;DR: Three variants of the basic simulated annealing implementation which are designed to overcome the multi-objective examination timetabling problem are proposed and compared using real university data as well as artificial data sets.
Abstract: This paper is concerned with the use of simulated annealing in the solution of the multi-objective examination timetabling problem. The solution method proposed optimizes groups of objectives in different phases. Some decisions from earlier phases may be altered later as long as the solution quality with respect to earlier phases does not deteriorate. However, such limitations may disconnect the solution space, thereby causing optimal or near-optimal solutions to be missed. Three variants of our basic simulated annealing implementation which are designed to overcome this problem are proposed and compared using real university data as well as artificial data sets. The underlying principles and conclusions stemming from the use of this method are generally applicable to many other multi-objective type problems.

Journal ArticleDOI
TL;DR: A model of primary care performance which is based on the premise that certain measurable quality indicators can act as proxies for outcome is set out, and it is argued that DEA can handle multiple dimensions of performance more comfortably, and is less vulnerable to the misspecification bias that afflicts statistically based models.
Abstract: The performance of primary care should ultimately be judged on its effect on the health outcome of individual patients. However, for the foreseeable future, it is inconceivable that the outcome data necessary to come to a judgement on performance will be available. And in any case, specification of the statistical model necessary to analyze outcome is fraught with difficulty. This paper therefore sets out a model of primary care performance which is based on the premise that certain measurable quality indicators can act as proxies for outcome. This being the case, a model of performance can be deduced which takes into account the effect of resources and patient characteristics on outcome. The most appropriate analytic technique to make this model operational is data envelopment analysis (DEA). It is argued that DEA can handle multiple dimensions of performance more comfortably, and is less vulnerable to the misspecification bias that afflicts statistically based models. The issues are illustrated with an example from English Family Health Service Authorities.

Journal ArticleDOI
TL;DR: The algorithm does not use any Big-M initial point and achieves \(O(\sqrt{nL})\)-iteration complexity, where n and L are the number of variables and the length of data of the LP problem, and detects LP infeasibility based on a provable criterion.
Abstract: We present a simplification and generalization of the recent homogeneous and self-dual linear programming (LP) algorithm. The algorithm does not use any Big-M initial point and achieves \(O(\sqrt{nL})\)-iteration complexity, where n and L are the number of variables and the length of data of the LP problem. It also detects LP infeasibility based on a provable criterion. Its preliminary implementation with a simple predictor and corrector technique results in an efficient computer code in practice. In contrast to other interior-point methods, our code solves NETLIB problems, feasible or infeasible, starting simply from x=e (primal variables), y=0 (dual variables), z=e (dual slack variables), where e is the vector of all ones. We describe our computational experience in solving these problems, and compare our results with OB1.60, a state-of-the-art implementation of interior-point algorithms.

Journal ArticleDOI
TL;DR: The application of a tabu search algorithm for solving the frequency assignment problem is presented; the method is efficient, robust and stable, and gives solutions which compare favourably with those obtained using a genetic algorithm.
Abstract: This paper presents the application of a tabu search algorithm for solving the frequency assignment problem. This problem, known to be NP-hard, is to find an assignment of frequencies for a number of communication links which satisfies various constraints. We report on our computational experiments in terms of computational efficiency and quality of the solutions obtained for realistic, computer-generated problem instances. The method is efficient, robust and stable and gives solutions which compare favourably with those obtained using a genetic algorithm.
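
The toy sketch below shows the basic tabu search mechanics on a frequency assignment instance: each move reassigns one link's frequency, reversing a recent move is forbidden for a fixed tenure, and an aspiration criterion overrides the tabu status when a move yields a new best solution. The instance (separation constraints on random link pairs) and parameters are illustrative assumptions, not the paper's realistic computer-generated problems.

```python
# Tabu search sketch for a toy frequency assignment instance: assign one of n_freqs
# frequencies to each link so that interfering links are at least `sep` apart.
import random

def violations(assign, constraints):
    return sum(1 for (a, b, sep) in constraints if abs(assign[a] - assign[b]) < sep)

def tabu_fap(n_links, n_freqs, constraints, iters=2000, tenure=8):
    assign = [random.randrange(n_freqs) for _ in range(n_links)]
    best, best_v = assign[:], violations(assign, constraints)
    tabu = {}                                           # (link, freq) -> iteration until which it is tabu
    for it in range(iters):
        candidates = []
        for link in range(n_links):
            for f in range(n_freqs):
                if f == assign[link]:
                    continue
                trial = assign[:]
                trial[link] = f
                v = violations(trial, constraints)
                if tabu.get((link, f), -1) < it or v < best_v:   # aspiration: accept if new best
                    candidates.append((v, link, f))
        v, link, f = min(candidates)
        tabu[(link, assign[link])] = it + tenure        # forbid moving back for `tenure` iterations
        assign[link] = f
        if v < best_v:
            best, best_v = assign[:], v
        if best_v == 0:
            break
    return best, best_v

if __name__ == "__main__":
    random.seed(3)
    n_links, n_freqs = 12, 6
    constraints = [(random.randrange(n_links), random.randrange(n_links), random.choice([1, 2]))
                   for _ in range(30)]
    constraints = [(a, b, s) for (a, b, s) in constraints if a != b]
    print(tabu_fap(n_links, n_freqs, constraints))
```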

Journal ArticleDOI
TL;DR: A generalized Data Envelopment Analysis (DEA) model is introduced which unifies and extends most of the well-known DEA models developed over the past fifteen years and points the way to new models.
Abstract: In this paper, we introduce a generalized Data Envelopment Analysis (DEA) model which unifies and extends most of the well-known DEA models developed over the past fifteen years and points the way to new models. By setting three binary parameters of this model to different values, we obtain subclasses of the DEA models with general K cone and W cone descriptions to represent the evaluator's preferences for the Decision Making Units (DMU) and the input/output categories. We also show relationships among the various different subclasses of the generalized DEA model and give special attention to efficiency definitions and solutions. Furthermore, we state and rigorously prove the equivalence between DEA efficiency and the nondominated solutions of a corresponding multi-objective program. This latter result is especially important for understanding and interpreting the concept of efficiency. Detailed examples are also presented to demonstrate the functions of K cone and W cone, as well as their characteristics.

Journal ArticleDOI
TL;DR: In this article, a novel algorithm for the global optimization of functions (C-RTS) is presented, in which a combinatorial optimization method cooperates with a stochastic local minimizer.
Abstract: A novel algorithm for the global optimization of functions (C-RTS) is presented, in which a combinatorial optimization method cooperates with a stochastic local minimizer. The combinatorial optimization component, based on the Reactive Tabu Search recently proposed by the authors, locates the most promising “boxes”, in which starting points for the local minimizer are generated. In order to cover a wide spectrum of possible applications without user intervention, the method is designed with adaptive mechanisms: the box size is adapted to the local structure of the function to be optimized, the search parameters are adapted to obtain a proper balance of diversification and intensification. The algorithm is compared with some existing algorithms, and the experimental results are presented for a variety of benchmark tasks.
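
A loosely inspired sketch of the "promising boxes feed a local minimizer" idea: the domain is divided into a coarse grid of boxes, each box is scored by a few random samples, and a local minimizer is started from the best point of the most promising boxes. The reactive tabu search over boxes and the adaptive box sizing of C-RTS are not reproduced; the grid resolution and the Rastrigin-style test function are assumptions.

```python
# "Promising boxes + local minimizer" sketch, loosely inspired by C-RTS (illustrative only).
import numpy as np
from scipy.optimize import minimize

def f(x):                                    # simple multimodal test function (Rastrigin-style)
    return np.sum(x**2) + 10 * np.sum(1 - np.cos(2 * np.pi * x))

def box_then_local(bounds, boxes_per_dim=5, samples_per_box=3, n_promising=4, seed=0):
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    edges = [np.linspace(l, h, boxes_per_dim + 1) for l, h in zip(lo, hi)]
    scored = []
    for idx in np.ndindex(*(boxes_per_dim,) * len(bounds)):        # every box on the grid
        b_lo = np.array([edges[d][i] for d, i in enumerate(idx)])
        b_hi = np.array([edges[d][i + 1] for d, i in enumerate(idx)])
        pts = rng.uniform(b_lo, b_hi, size=(samples_per_box, len(bounds)))
        best_pt = min(pts, key=f)                                  # score the box by its best sample
        scored.append((f(best_pt), best_pt))
    scored.sort(key=lambda t: t[0])
    results = [minimize(f, pt, bounds=bounds) for _, pt in scored[:n_promising]]
    return min(results, key=lambda r: r.fun)

if __name__ == "__main__":
    res = box_then_local([(-5.12, 5.12), (-5.12, 5.12)])
    print(res.x, res.fun)
```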

Journal ArticleDOI
TL;DR: An enhanced Benders decomposition algorithm is developed for solving multistage stochastic linear programs and includes warm start basis selection, preliminary cut generation, the multicut procedure, and decision tree traversing strategies.
Abstract: Handling uncertainty in natural inflow is an important part of a hydroelectric scheduling model. In a stochastic programming formulation, natural inflow may be modeled as a random vector with known distribution, but the size of the resulting mathematical program can be formidable. Decomposition-based algorithms take advantage of special structure and provide an attractive approach to such problems. We develop an enhanced Benders decomposition algorithm for solving multistage stochastic linear programs. The enhancements include warm start basis selection, preliminary cut generation, the multicut procedure, and decision tree traversing strategies. Computational results are presented for a collection of stochastic hydroelectric scheduling problems.

Journal ArticleDOI
TL;DR: The results indicate that the combined approach and its modified versions are better than either of the pure strategies as well as the heuristic algorithm.
Abstract: In this paper, we study the application of a meta-heuristic to a two-machine flowshop scheduling problem. The meta-heuristic uses a branch-and-bound procedure to generate some information, which in turn is used to guide a genetic algorithm's search for optimal and near-optimal solutions. The criteria considered are makespan and average job flowtime. The problem has applications in flowshop environments where management is interested in reducing turn-around and job idle times simultaneously. We develop the combined branch-and-bound and genetic algorithm based procedure and two modified versions of it. Their performance is compared with that of three algorithms: pure branch-and-bound, pure genetic algorithm, and a heuristic. The results indicate that the combined approach and its modified versions are better than either of the pure strategies as well as the heuristic algorithm.
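
The two criteria mentioned, makespan and average job flowtime, are computed for a job sequence as below; Johnson's rule is included as the classical makespan-optimal benchmark for the two-machine flowshop. The processing times are made-up data, and the sketch does not reproduce the combined branch-and-bound/genetic procedure.

```python
# Evaluating a job sequence in a two-machine flowshop (illustrative sketch).

def evaluate(seq, p1, p2):
    """Return (makespan, mean flowtime) of permutation `seq` on machines with times p1, p2."""
    t1 = t2 = 0.0
    flowtimes = []
    for j in seq:
        t1 += p1[j]                 # completion on machine 1
        t2 = max(t2, t1) + p2[j]    # machine 2 starts when both the job and the machine are free
        flowtimes.append(t2)
    return t2, sum(flowtimes) / len(flowtimes)

def johnson(p1, p2):
    """Johnson's rule: makespan-optimal sequence for the two-machine flowshop."""
    front = sorted((j for j in range(len(p1)) if p1[j] <= p2[j]), key=lambda j: p1[j])
    back = sorted((j for j in range(len(p1)) if p1[j] > p2[j]), key=lambda j: -p2[j])
    return front + back

if __name__ == "__main__":
    p1 = [5, 2, 7, 3, 6]
    p2 = [4, 6, 3, 5, 2]
    seq = johnson(p1, p2)
    print(seq, evaluate(seq, p1, p2))
```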

Journal ArticleDOI
TL;DR: In evaluating efficiency, DEA generally shows superior performance, with BCC models being best (except at “corner points”), followed by the CCR model and then by COLS, with log-linear regressions performing better than their translog counterparts at almost all sample sizes.
Abstract: Using statistically designed experiments, 12,500 observations are generated from a “4-pieced Cobb-Douglas function” exhibiting increasing and decreasing returns to scale in its different pieces. Performances of DEA and frontier regressions represented by COLS (Corrected Ordinary Least Squares) are compared at sample sizes of n=50, 100, 150 and 200. Statistical consistency is exhibited, with performances improving as sample sizes increase. Both DEA and COLS generally give good results at all sample sizes. In evaluating efficiency, DEA generally shows superior performance, with BCC models being best (except at “corner points”), followed by the CCR model and then by COLS, with log-linear regressions performing better than their translog counterparts at almost all sample sizes. Because of the need to consider locally varying behavior, only the CCR and translog models are used for returns to scale, with CCR being the better performer. An additional set of 7,500 observations was generated under conditions that made it possible to compare efficiency evaluations in the presence of collinearity and with model misspecification in the form of added and omitted variables. Results were similar to the larger experiment: the BCC model is the best performer. However, COLS exhibited surprisingly good performances — which suggests that COLS may have previously unidentified robustness properties — while the CCR model is the poorest performer when one of the variables used to generate the observations is omitted.

Journal ArticleDOI
TL;DR: It is demonstrated that tabu search is superior to other solution approaches for the uniform graph partitioning problem both with respect to solution quality and computational requirements.
Abstract: In this paper, we develop a tabu search procedure for solving the uniform graph partitioning problem. Tabu search, an abstract heuristic search method, has been shown to have promise in solving several NP-hard problems, such as job shop and flow shop scheduling, vehicle routing, quadratic assignment, and maximum satisfiability. We compare tabu search to other heuristic procedures for graph partitioning, and demonstrate that tabu search is superior to other solution approaches for the uniform graph partitioning problem both with respect to solution quality and computational requirements.
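
A compact sketch of tabu search on the uniform (equal-size, two-way) graph partitioning problem: the move swaps one vertex from each side so balance is preserved, swapped vertices become tabu for a fixed tenure, and the best cut found is retained. The random graph, tenure and iteration counts are assumptions for illustration, not the paper's experimental setup.

```python
# Tabu search sketch for uniform two-way graph partitioning (illustrative only).
import random

def cut_size(part, edges):
    return sum(1 for (u, v) in edges if part[u] != part[v])

def tabu_partition(n, edges, iters=500, tenure=7):
    part = [0] * (n // 2) + [1] * (n - n // 2)
    random.shuffle(part)
    best, best_c = part[:], cut_size(part, edges)
    tabu = {}                                        # vertex -> iteration until which it is tabu
    for it in range(iters):
        moves = []
        side0 = [v for v in range(n) if part[v] == 0]
        side1 = [v for v in range(n) if part[v] == 1]
        for u in side0:
            for v in side1:
                trial = part[:]
                trial[u], trial[v] = 1, 0            # balanced swap move
                c = cut_size(trial, edges)
                if (tabu.get(u, -1) < it and tabu.get(v, -1) < it) or c < best_c:
                    moves.append((c, u, v))
        if not moves:                                # all moves tabu: clear the list and retry
            tabu.clear()
            continue
        c, u, v = min(moves)
        part[u], part[v] = 1, 0
        tabu[u] = tabu[v] = it + tenure
        if c < best_c:
            best, best_c = part[:], c
    return best, best_c

if __name__ == "__main__":
    random.seed(4)
    n = 16
    edges = [(u, v) for u in range(n) for v in range(u + 1, n) if random.random() < 0.3]
    print(tabu_partition(n, edges)[1], "cut edges")
```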

Journal ArticleDOI
TL;DR: New combinations of Data Envelopment Analysis (DEA) and statistical approaches that can be used to evaluate efficiency within a multiple-input multiple-output framework are examined, consistent with what might be expected from economic theory and informative for educational policy uses.
Abstract: This paper examines new combinations of Data Envelopment Analysis (DEA) and statistical approaches that can be used to evaluate efficiency within a multiple-input multiple-output framework. Using data on five outputs and eight inputs for 638 public secondary schools in Texas, unsatisfactory results are obtained initially from both Ordinary Least Squares (OLS) and Stochastic Frontier (SF) regressions run separately using one output variable at a time. Canonical correlation analysis is then used to aggregate the multiple outputs into a single “aggregate” output, after which separate regressions are estimated for the subsets of schools identified as efficient and inefficient by DEA. Satisfactory results are finally obtained by a joint use of DEA and statistical regressions in the following manner. DEA is first used to identify the subset of DEA-efficient schools. The entire collection of schools is then comprehended in a single regression with dummy variables used to distinguish between DEA-efficient and DEA-inefficient schools. The input coefficients are positive for the efficient schools and negative and statistically significant for the inefficient schools. These results are consistent with what might be expected from economic theory and are informative for educational policy uses. They also extend the treatments of production functions usually found in the econometrics literature to obtain one regression relation that can be used to evaluate both efficient and inefficient behavior.

Journal ArticleDOI
TL;DR: A general decomposition framework for large convex optimization problems based on augmented Lagrangians is described, and the approach is applied to multistage stochastic programming problems in two different ways: by decomposing the problem into scenarios and by decomposing it into nodes corresponding to stages.
Abstract: A general decomposition framework for large convex optimization problems based on augmented Lagrangians is described. The approach is then applied to multistage stochastic programming problems in two different ways: by decomposing the problem into scenarios and by decomposing it into nodes corresponding to stages. Theoretical convergence properties of the two approaches are derived and a computational illustration is presented.

Journal ArticleDOI
TL;DR: This work studies and compares asynchronous parallelization strategies for tabu search, and evaluates the impact on performance and solution quality of some important algorithmic design parameters: number of processors, handling of exchanged information, etc.
Abstract: We study and compare asynchronous parallelization strategies for tabu search, and evaluate the impact on performance and solution quality of some important algorithmic design parameters: number of processors, handling of exchanged information, etc. Parallelization approaches are implemented and compared by using a tabu search algorithm for multicommodity location-allocation problems with balancing requirements.

Journal ArticleDOI
TL;DR: An algorithm is devised that efficiently determines the convex hull of the objective function; it covers many objective functions in the class, rather than only one-dimensional versions, and is faster than the algorithm in the previous paper.
Abstract: We consider the objective function of a simple integer recourse problem with fixed technology matrix and discretely distributed right-hand sides. Exploiting the special structure of this problem, we devise an algorithm that determines the convex hull of this function efficiently. The results are improvements over those in a previous paper. In the first place, the convex hull of many objective functions in the class is covered, instead of only one-dimensional versions. In the second place, the algorithm is faster than the one in the previous paper. Moreover, some new results on the structure of the objective function are presented.

Journal ArticleDOI
TL;DR: A cutting plane algorithm for solving the linear ordering problem is described, which uses the primal-dual interior point method to solve the linear programming relaxations.
Abstract: Cutting plane methods require the solution of a sequence of linear programs, where the solution to one provides a warm start to the next. A cutting plane algorithm for solving the linear ordering problem is described. This algorithm uses the primal-dual interior point method to solve the linear programming relaxations. A point which is a good warm start for a simplex-based cutting plane algorithm is generally not a good starting point for an interior point method. Techniques used to improve the warm start include attempting to identify cutting planes early and storing an old feasible point, which is used to help recenter when cutting planes are added. Computational results are described for some real-world problems; the algorithm appears to be competitive with a simplex-based cutting plane algorithm.
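
The cutting plane loop itself can be sketched compactly: solve the current LP relaxation of the linear ordering problem, search for violated 3-cycle (triangle) inequalities, add them, and resolve. The sketch below uses scipy's HiGHS solver rather than the primal-dual interior point warm starts that are the paper's focus, and the random weight matrix is an assumption.

```python
# Cutting plane sketch for the linear ordering LP relaxation (illustrative only):
# start from x_ij + x_ji = 1, then repeatedly add violated triangle inequalities
# x_ij + x_jk + x_ki <= 2 and resolve.
import itertools
import numpy as np
from scipy.optimize import linprog

def lp_with_cuts(w, max_rounds=20):
    n = len(w)
    pairs = [(i, j) for i in range(n) for j in range(n) if i != j]
    col = {p: k for k, p in enumerate(pairs)}
    c = np.array([-w[i][j] for (i, j) in pairs])           # maximize total weight
    A_eq = np.zeros((n * (n - 1) // 2, len(pairs)))
    for r, (i, j) in enumerate(itertools.combinations(range(n), 2)):
        A_eq[r, col[(i, j)]] = A_eq[r, col[(j, i)]] = 1.0   # x_ij + x_ji = 1
    b_eq = np.ones(A_eq.shape[0])
    cuts, rhs = [], []
    for _ in range(max_rounds):
        res = linprog(c, A_ub=np.array(cuts) if cuts else None,
                      b_ub=np.array(rhs) if rhs else None,
                      A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
        x = res.x
        added = 0
        for i, j, k in itertools.permutations(range(n), 3):
            if i < j and i < k and x[col[(i, j)]] + x[col[(j, k)]] + x[col[(k, i)]] > 2 + 1e-6:
                row = np.zeros(len(pairs))
                row[col[(i, j)]] = row[col[(j, k)]] = row[col[(k, i)]] = 1.0
                cuts.append(row)
                rhs.append(2.0)
                added += 1
        if added == 0:                                      # no violated triangle inequality left
            break
    return x, -res.fun

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    n = 6
    w = rng.integers(0, 10, size=(n, n)).tolist()
    x, val = lp_with_cuts(w)
    print("LP bound:", val)
```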

Journal ArticleDOI
TL;DR: The results suggest that collective units had a better performance than state-owned units in the two consecutive years analyzed, and show that some AR-efficient factories were more flexible in adopting the mixture of central planning and market economies that China currently is trying to use.
Abstract: This article employs new data envelopment analysis/assurance region (DEA/AR) methods to evaluate the efficiency of the 35 textile factories of the Nanjing Textiles Corporation (NTC), Nanjing, China. The returns to scale (RTS) of these factories were studied without assuming that the optimal DEA solutions were unique. All DMUs are identified with points E (Extreme Efficient), E′ (Efficient but not an extreme point) and F (Frontier but not efficient). We then further identify the nonfrontier DMUs with points NE, NE′ and NF according to whether they are projected onto a point in E, E′, or F en route to evaluating their performances. All of the inefficient factories were in class NF and had unique optimal primal-dual solution pairs. Consequently, the solution pairs satisfy the strong complementary slackness condition (SCSC). Application of cone-ratio (CR) ARs reduced significantly the number of factories in class E, and showed that some AR-efficient factories were more flexible in adopting the mixture of central planning and market economies that China currently is trying to use. Also, linked-cone (LC) ARs were applied to measure maximum and minimum profit ratios. The SCSC multiplier space approach was utilized to analyze the sensitivity of the efficiency results to potential errors in the data with and without ARs. The results in this article suggest that collective units had a better performance than state-owned units in the two consecutive years analyzed.

Journal ArticleDOI
TL;DR: Analysis of physician practice patterns in a Health Maintenance Organization (HMO) using single and multi-stage applications of Data Envelopment Analysis (DEA) suggests specific new paths which may prove effective at reducing health care costs within managed care organizations.
Abstract: Physician practice patterns in a Health Maintenance Organization (HMO) are analyzed using single and multi-stage applications of Data Envelopment Analysis (DEA). Best practice (BP) patterns are identified, which can serve as benchmark targets for inefficient physicians. The results suggest three health policy and resource utilization control strategies, pointing to specific new paths which may prove effective at reducing health care costs within managed care organizations, the health care providers most likely to dominate the U.S. health system in the future. A multi-stage DEA technique is used to locate specific types of inefficient physicians. Methods to test the clinical viability of using DEA to realize the potential cost savings and extensions of this research are discussed.

Journal ArticleDOI
TL;DR: In this paper, the authors investigate two dynamic strategies, the reverse elimination method and the cancellation sequence method, for the traveling purchaser problem, a generalization of the classical traveling salesman problem.
Abstract: Tabu search is a metastrategy for guiding known heuristics to overcome local optimality with a large number of successful applications reported in the literature. In this paper we investigate two dynamic strategies, the reverse elimination method and the cancellation sequence method. The incorporation of strategic oscillation as well as a combination of these methods are developed. The impact of the different methods is shown with respect to the traveling purchaser problem, a generalization of the classical traveling salesman problem. The traveling purchaser problem is the problem of determining a tour of a purchaser buying several items in different shops by minimizing the total amount of travel and purchase costs. A comparison of the tabu search strategies with a simulated annealing approach is presented, too.

Journal ArticleDOI
TL;DR: A contamination technique is presented as a numerically tractable tool to post-optimization and analysis of robustness of the optimal value of scenario-based stochastic programs and of the expected value problems.
Abstract: A contamination technique is presented as a numerically tractable tool for post-optimization and analysis of robustness of the optimal value of scenario-based stochastic programs and of the expected value problems. Detailed applications of the method concern the two-stage stochastic linear programs with random recourse and the corresponding robust optimization problems.

Journal ArticleDOI
TL;DR: Details and results of a simulation study realized in the Surgical Emergency Department at Istanbul University School of Medicine are included to suggest new bed capacities to improve the current system, and also to provide the management with guidelines for their expansion plans.
Abstract: Due to its highly stochastic nature and complex interaction between services involved, health care has been a demanding area of application for computer simulation. This paper includes details and results of a simulation study realized in the Surgical Emergency Department at Istanbul University School of Medicine. The purpose is to suggest new bed capacities to improve the current system, and also to provide the management with guidelines for their expansion plans. For this aim, arrival rates, treatment procedures, inpatient admittance, and service durations have been carefully analyzed and modeled. The model, coded in SLAM-II simulation language, has been run under several bed capacity scenarios, and resulting queueing and waiting patterns have been discussed in detail.
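
A much-simplified sketch of the kind of bed-capacity experiment described: patients arrive as a Poisson process, stay for an exponentially distributed time, and the fraction who find no free bed is recorded for each candidate capacity. All rates are invented, not the Istanbul data, and the paper's SLAM-II model (with treatment procedures and inpatient admittance) is far more detailed.

```python
# Toy bed-capacity simulation (illustrative sketch, not the paper's SLAM-II model).
import heapq, random

def simulate(n_beds, arrival_rate, mean_stay, horizon=10_000, seed=0):
    rng = random.Random(seed)
    discharges = []            # min-heap of discharge times for occupied beds
    t, arrivals, blocked = 0.0, 0, 0
    while t < horizon:
        t += rng.expovariate(arrival_rate)               # next patient arrival
        while discharges and discharges[0] <= t:
            heapq.heappop(discharges)                     # free beds whose patients have left
        arrivals += 1
        if len(discharges) < n_beds:
            heapq.heappush(discharges, t + rng.expovariate(1 / mean_stay))
        else:
            blocked += 1                                  # no bed available on arrival
    return blocked / arrivals

if __name__ == "__main__":
    for beds in (10, 12, 14, 16):
        print(beds, "beds -> blocked fraction",
              round(simulate(beds, arrival_rate=1.0, mean_stay=12.0), 3))
```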