
Showing papers on "Set cover problem published in 2002"


Journal ArticleDOI
TL;DR: In this article, the problem of designing spatially cohesive nature reserve systems that meet biodiversity objectives is formulated as a nonlinear integer programming problem, where the multiobjective function minimises a combination of boundary length, area and failed representation of the biological attributes we are trying to conserve.
Abstract: The problem of designing spatially cohesive nature reserve systems that meet biodiversity objectives is formulated as a nonlinear integer programming problem. The multiobjective function minimises a combination of boundary length, area and failed representation of the biological attributes we are trying to conserve. The task is to reserve a subset of sites that best meet this objective. We use data on the distribution of habitats in the Northern Territory, Australia, to show how simulated annealing and a greedy heuristic algorithm can be used to generate good solutions to such large reserve design problems, and to compare the effectiveness of these methods.

269 citations
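The simulated annealing approach used above for reserve selection can be sketched on a toy objective. Everything below (the site data, the penalty weight, the linear cooling schedule) is a hypothetical illustration of the general technique, not the paper's boundary-length model:

```python
import math
import random

def anneal_reserve(species_of, penalty=10.0, steps=5000, seed=0):
    """Toy simulated annealing for reserve site selection: minimise
    (#sites selected) + penalty * (#species left unrepresented),
    flipping one candidate site in or out per step.

    `species_of[i]` is the set of species present at candidate site i.
    """
    rng = random.Random(seed)
    n = len(species_of)
    all_species = set().union(*species_of)

    def cost(sel):
        covered = set().union(*(species_of[i] for i in range(n) if sel[i]))
        return sum(sel) + penalty * len(all_species - covered)

    sel = [True] * n                       # start from the full reserve
    cur = cost(sel)
    best_sel, best_cost = sel[:], cur
    for t in range(steps):
        temp = max(1e-3, 1.0 - t / steps)  # linear cooling schedule
        i = rng.randrange(n)
        sel[i] = not sel[i]                # propose flipping one site
        new = cost(sel)
        if new <= cur or rng.random() < math.exp((cur - new) / temp):
            cur = new                      # accept (always if improving)
            if new < best_cost:
                best_sel, best_cost = sel[:], new
        else:
            sel[i] = not sel[i]            # reject: revert the flip
    return best_sel, best_cost

# Hypothetical instance: 4 sites, 4 species; two sites suffice.
reserve, rc = anneal_reserve([{0, 1}, {1, 2}, {2, 3}, {0, 3}])
```

Because the penalty exceeds the number of sites, any best solution found with cost below the penalty necessarily represents every species.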


Journal ArticleDOI
TL;DR: New randomized distributed algorithms for the dominating set problem are described and analyzed that run in polylogarithmic time, independent of the diameter of the network, and that return a dominating set of size within a logarithmic factor from optimal, with high probability.
Abstract: The dominating set problem asks for a small subset D of nodes in a graph such that every node is either in D or adjacent to a node in D. This problem arises in a number of distributed network applications, where it is important to locate a small number of centers in the network such that every node is near at least one center. Finding a dominating set of minimum size is NP-complete, and the best known approximation is logarithmic in the maximum degree of the graph and is provided by the same simple greedy approach that gives the well-known logarithmic approximation result for the closely related set cover problem. We describe and analyze new randomized distributed algorithms for the dominating set problem that run in polylogarithmic time, independent of the diameter of the network, and that return a dominating set of size within a logarithmic factor from optimal, with high probability. In particular, our best algorithm runs in O(log n log Δ) rounds with high probability, where n is the number of nodes, Δ is one plus the maximum degree of any node, and each round involves a constant number of message exchanges between any two neighbors; the size of the dominating set obtained is within O(log Δ) of the optimal in expectation and within O(log n) of the optimal with high probability. We also describe generalizations to the weighted case and the case of multiple covering requirements.

226 citations
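The greedy rule underlying the logarithmic approximation mentioned above (for both set cover and dominating set) can be sketched as follows; the instance data is hypothetical:

```python
def greedy_set_cover(universe, sets):
    """Repeatedly pick the set covering the most still-uncovered
    elements, the classical greedy rule behind the logarithmic
    approximation guarantee for set cover.
    """
    uncovered = set(universe)
    chosen = []
    while uncovered:
        # Pick the set with maximum coverage of uncovered elements.
        best = max(sets, key=lambda s: len(uncovered & s))
        if not (uncovered & best):
            raise ValueError("instance is not coverable")
        chosen.append(best)
        uncovered -= best
    return chosen

# Hypothetical instance: cover {1..6}; greedy needs two sets here.
chosen_sets = greedy_set_cover(
    {1, 2, 3, 4, 5, 6},
    [{1, 2, 3, 4}, {3, 4, 5}, {5, 6}, {1, 6}],
)
```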


Journal ArticleDOI
TL;DR: This paper presents a new type of genetic algorithm for the set covering problem that is an indirect algorithm, i.e., the actual solutions are found by an external decoder function, and it is shown that results can be further improved by adding another indirect optimisation layer.
Abstract: This paper presents a new type of genetic algorithm for the set covering problem. It differs from previous evolutionary approaches first because it is an indirect algorithm, i.e. the actual solutions are found by an external decoder function. The genetic algorithm itself provides this decoder with permutations of the solution variables and other parameters. Second, it will be shown that results can be further improved by adding another indirect optimisation layer. The decoder will not directly seek out low cost solutions but instead aims for good exploitable solutions. These are then post optimised by another hill-climbing algorithm. Although seemingly more complicated, we will show that this three-stage approach has advantages in terms of solution quality, speed and adaptability to new types of problems over more direct approaches. Extensive computational results are presented and compared to the latest evolutionary and other heuristic approaches to the same data instances.

120 citations


Journal ArticleDOI
TL;DR: In this article, the structure of the set of probabilistically efficient points of binary random vectors is analyzed, and a branch-and-bound algorithm for probabilistic set-covering problems is proposed.
Abstract: In a probabilistic set-covering problem the right-hand side is a random binary vector and the covering constraint has to be satisfied with some prescribed probability. We analyze the structure of the set of probabilistically efficient points of binary random vectors, develop methods for their enumeration, and propose specialized branch-and-bound algorithms for probabilistic set-covering problems.

104 citations


Proceedings ArticleDOI
16 Nov 2002
TL;DR: This paper considers the classical vertex cover and set cover problems with the addition of hard capacity constraints, gives a 3-approximation algorithm based on randomized rounding with alterations, and proves that the weighted version is at least as hard as the set cover problem.
Abstract: We consider the classical vertex cover and set cover problems with the addition of hard capacity constraints. This means that a set (vertex) can only cover a limited number of its elements (adjacent edges) and the number of available copies of each set (vertex) is bounded. This is a natural generalization of the classical problems that also captures resource limitations in practical scenarios. We obtain the following results. For the unweighted vertex cover problem with hard capacities we give a 3-approximation algorithm which is based on randomized rounding with alterations. We prove that the weighted version is at least as hard as the set cover problem. This is an interesting separation between the approximability of weighted and unweighted versions of a "natural" graph problem. A logarithmic approximation factor for both the set cover and the weighted vertex cover problem with hard capacities follows from the work of Wolsey (1982) on submodular set cover. We provide in this paper a simple and intuitive proof for this bound.

96 citations


Journal ArticleDOI
TL;DR: These approximation algorithms transform the terrain guarding instance into a MINIMUM SET COVER instance, which is then solved by the standard greedy approximation algorithm, achieving an approximation ratio of O(log n), where n is the number of vertices in the input terrain.

73 citations


Book ChapterDOI
TL;DR: It is shown that a greedy algorithm approximates min sum set cover within a ratio of 4, and that for every ε > 0 it is NP-hard to approximate within a ratio of 4 - ε.
Abstract: The input to the min sum set cover problem is a collection of n sets that jointly cover m elements. The output is a linear order on the sets, namely, in every time step from 1 to n exactly one set is chosen. For every element, this induces a first time step by which it is covered. The objective is to find a linear arrangement of the sets that minimizes the sum of these first time steps over all elements. We show that a greedy algorithm approximates min sum set cover within a ratio of 4. This result was implicit in work of Bar-Noy, Bellare, Halldorsson, Shachnai and Tamir (1998) on chromatic sums, but we present a simpler proof. We also show that for every ε > 0, achieving an approximation ratio of 4 - ε is NP-hard. For the min sum vertex cover version of the problem, we show that it can be approximated within a ratio of 2, and is NP-hard to approximate within some constant factor greater than 1.

72 citations
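The 4-approximation greedy algorithm described above uses the same max-coverage rule as classical set cover, but the objective charges each element the time step at which it is first covered. A minimal sketch (the instance is hypothetical):

```python
def min_sum_set_cover_greedy(universe, sets):
    """Greedy ordering for min sum set cover: at each time step pick
    the set covering the most still-uncovered elements; each element
    contributes its first cover time to the objective.
    Returns (order of sets, sum of first cover times).
    """
    uncovered = set(universe)
    remaining = list(sets)
    order, total, step = [], 0, 0
    while uncovered and remaining:
        step += 1
        best = max(remaining, key=lambda s: len(uncovered & s))
        remaining.remove(best)
        order.append(best)
        total += step * len(uncovered & best)  # newly covered elements pay `step`
        uncovered -= best
    return order, total

# Hypothetical instance: {1,2,3} covered at step 1, {4} at step 2,
# so the objective is 1+1+1+2 = 5.
order, msc_cost = min_sum_set_cover_greedy(
    {1, 2, 3, 4}, [{1, 2, 3}, {3, 4}, {4}]
)
```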


Journal ArticleDOI
TL;DR: Experimental results obtained with a binary representation of the SCP show that the parallel genetic algorithm PGA performs better than the sequential model in terms of the number of generations needed to achieve solutions of an acceptable quality.

63 citations


Book ChapterDOI
TL;DR: This paper presents a hybrid approach based on Ant Systems combined with a local search for solving the Set Covering Problem; an empirical tuning of its parameters has been carried out.
Abstract: The Set Covering Problem (SCP) represents an important class of NP-hard combinatorial optimization problems. Currently, exact algorithms such as branch and bound do not find optimal solutions within a reasonable amount of time, except for small problems. Recently, Ant Systems (AS) have been proposed for solving several combinatorial optimization problems. This paper presents a hybrid approach based on Ant Systems combined with a local search for solving the SCP. To validate the implemented approach, numerous tests have been performed on known benchmarks, and an empirical tuning of its parameters has been carried out. To improve the performance of this algorithm, two parallel implementations are proposed.

42 citations


Journal ArticleDOI
TL;DR: A probabilistic greedy search method for combinatorial optimisation problems is presented and shown to yield a simple, robust, and quite fast heuristic for the Set Covering Problem (SCP).
Abstract: We present a probabilistic greedy search method for combinatorial optimisation problems. This approach is implemented and evaluated for the Set Covering Problem (SCP) and shown to yield a simple, robust, and quite fast heuristic. Tests performed on a large set of benchmark instances with up to 1000 rows and 10 000 columns show that the algorithm consistently yields near-optimal solutions.

42 citations
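The probabilistic greedy idea can be sketched as a randomized restart of the standard greedy construction: instead of always taking the most cost-effective column, pick randomly among a few near-best candidates and keep the cheapest cover found over several runs. The candidate-list size, scoring rule, and instance below are illustrative assumptions, not the paper's exact method:

```python
import random

def probabilistic_greedy_scp(rows, columns, costs, runs=20, k=3, seed=0):
    """Randomized greedy for set covering: repeat a greedy construction
    that picks uniformly among the k most cost-effective columns, and
    keep the cheapest cover found.

    `columns[j]` is the set of rows column j covers; `costs[j]` its cost.
    """
    rng = random.Random(seed)
    best_cover, best_cost = None, float("inf")
    for _ in range(runs):
        uncovered = set(rows)
        chosen, cost = [], 0
        while uncovered:
            # Rank useful columns by cost per newly covered row.
            candidates = sorted(
                (j for j in range(len(columns)) if uncovered & columns[j]),
                key=lambda j: costs[j] / len(uncovered & columns[j]),
            )[:k]
            j = rng.choice(candidates)  # assumes the instance is feasible
            chosen.append(j)
            cost += costs[j]
            uncovered -= columns[j]
        if cost < best_cost:
            best_cover, best_cost = chosen, cost
    return best_cover, best_cost

# Hypothetical unit-cost instance; the optimum uses two columns.
cols = [{1, 2}, {2, 3}, {3, 4}, {1, 4}]
scp_cover, scp_cost = probabilistic_greedy_scp({1, 2, 3, 4}, cols, [1, 1, 1, 1])
```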


Book ChapterDOI
TL;DR: The priority algorithm framework introduced by Borodin, Nielsen and Rackoff is applied and extended to define "greedy-like" algorithms for (uncapacitated) facility location and set cover; the resulting lower bounds are orthogonal to complexity considerations and hence apply to algorithms that are not necessarily polynomial-time.
Abstract: We apply and extend the priority algorithm framework introduced by Borodin, Nielsen and Rackoff to define "greedy-like" algorithms for (uncapacitated) facility location and set cover. These problems have been the focus of extensive research from the point of view of approximation algorithms, and for both problems greedy algorithms have been proposed and analyzed. The priority algorithm definitions are general enough so as to capture a broad class of algorithms that can be characterized as "greedy-like" while still possible to derive non-trivial lower bounds on the approximability of the problems. Our results are orthogonal to complexity considerations, and hence apply to algorithms that are not necessarily polynomial-time.

Book ChapterDOI
TL;DR: An analysis of fitness-distance correlation shows that several classes of set covering instances exhibit largely different behavior; new ways of generating core problems and of analyzing the performance of algorithms exploiting these core problems are proposed.
Abstract: The set covering problem is an NP-hard combinatorial optimization problem that arises in applications ranging from crew scheduling in airlines to driver scheduling in public mass transport. In this paper we analyze search space characteristics of a widely used set of benchmark instances through an analysis of the fitness-distance correlation. This analysis shows that there exist several classes of set covering instances that show a largely different behavior. For instances with high fitness distance correlation, we propose new ways of generating core problems and analyze the performance of algorithms exploiting these core problems.

Journal ArticleDOI
15 Nov 2002
TL;DR: This tutorial presents the state of the art and open problems on the set covering problem, a rich area of discrete mathematics where polyhedral combinatorics and graph theory come together.
Abstract: The Operations Research model known as the Set Covering Problem has a wide range of applications. See, for example, the survey by Ceria, Nobili and Sassano in the volume edited by Dell'Amico, Maffioli and Martello (Annotated Bibliographies in Combinatorial Optimization, Wiley, New York, 1997). Sometimes, due to the special structure of the constraint matrix, the natural linear programming relaxation yields an optimal solution that is integer, thus solving the problem. Under which conditions do such integrality properties hold? This question is of both theoretical and practical interest. On the theoretical side, polyhedral combinatorics and graph theory come together in this rich area of discrete mathematics. In this tutorial, we present the state of the art and open problems on this question.

Journal ArticleDOI
TL;DR: A new method of analysis is used that combines ideas from the greedy algorithm for set cover with a matroid-style exchange argument to model the connectivity constraint, and a tight instance of the Steiner tree problem in graphs is provided.

01 Jan 2002
TL;DR: This thesis designs approximation algorithms for NP-hard covering problems in graphs by unearthing polyhedral roots to better understood problems, and presents a ½-integral relaxation for the General Factor Problem, one of the most general matching problems whose optimization complexity is unresolved.
Abstract: The motivation of this thesis is twofold: (i) designing approximation algorithms for NP-hard covering problems in graphs by unearthing polyhedral roots to better understood problems, and (ii) a dual approach in which we hope to expand the foundation of well-understood polyhedra. Our design of approximation algorithms focuses on the Edge-dominating Set Problem, that of covering the edges of an edge-weighted graph, (V, E), with its edges at minimum cost. This problem captures the notion of covering in graphs in the sense that it is a common generalization of the well-studied Vertex Cover (cover E using V) and Edge Cover (cover V using E) problems and can be used to model a specialization of the Total Cover Problem (cover V ∪ E using V ∪ E). To the best of our knowledge, we present among the first nontrivial approximation algorithms for this problem, culminating with a 2-approximation. This is tight in the sense that it matches the approximability of its specialization, Vertex Cover, which has eluded a constant-factor improvement in approximability for decades. We also present approximation algorithms for generalizations and variants including the problems in which edges have demands and capacities or we seek an edge-dominating set that is connected or a closed walk. Our approximation algorithms are simply stated; however, our analysis relies intimately on a polyhedral understanding of related problems from matching theory. In an attempt to expand this frontier, we explore several polyhedra corresponding to variants of the Edge Cover Problem, a matching problem in disguise, and present a ½-integral relaxation for the General Factor Problem, one of the most general matching problems whose optimization complexity is unresolved.

Proceedings ArticleDOI
12 May 2002
TL;DR: Taguchi's orthogonal experimental design is applied in generalising the approach to the whole class of set covering problems, which basically ask to cover the rows of a zero-one matrix with a subset of columns at minimal cost.
Abstract: The set covering problem has a very wide area of applications, and scheduling is one of its most important applications. At CEC2001, a fuzzy simulated evolution algorithm was reported for a set covering problem in bus and rail driver scheduling. This paper reports on the generalisation of the approach to the whole class of set covering problems, which basically ask to cover the rows of a zero-one matrix with a subset of columns at minimal cost. The simulated evolution method iteratively transforms a single solution, treating its selected columns as a population. At each iteration a portion of the population is probabilistically discarded, biased towards selecting the 'weaker' columns; the broken solution is then reconstructed by means of a greedy heuristic. The columns are evaluated under several criteria formulated based on fuzzy set theory. The set of weights associated with the evaluation criteria was previously either fixed in advance or calibrated by means of a simplified genetic algorithm. In this paper, we consider some discrete levels of values instead of a continuous range from 0 to 1 for each weight. We have also included two other parameters driving the evolutionary process as additional factors having some discrete levels of values. Taguchi's orthogonal experimental design is applied. This has the effect of comprehensively evaluating the combinations of factors, although only a small fraction of the possible combinations is explicitly tested. Better results have been obtained, and comparisons with those previously reported are discussed.

Proceedings ArticleDOI
22 Sep 2002
TL;DR: A hybrid solver architecture is proposed that combines the raw speed of instance-specific reconfigurable hardware with flexible bounding schemes implemented in software.
Abstract: We present instance-specific custom computing machines for the set covering problem. Four accelerator architectures are developed that implement branch & bound in 3-valued logic and many of the deduction techniques found in software solvers. We use set covering benchmarks from two-level logic minimization and Steiner triple systems to derive and discuss experimental results. The resulting raw speedups are on the order of four orders of magnitude on average. Finally, we propose a hybrid solver architecture that combines the raw speed of instance-specific reconfigurable hardware with flexible bounding schemes implemented in software.

Journal ArticleDOI
01 Oct 2002-Networks
TL;DR: The mountain property on the coefficients in each column of the constraint matrix of a minimax problem, and its generalization, the bitonic property, are defined, and it is shown that these are equivalent, respectively, to set cover problems with consecutive 1's in each column or circular 1's in each column, thus solving the periodic scheduling problem.
Abstract: The minimax problem is a new optimization problem which substitutes maximum for addition in the constraint inequalities of a linear program. We show how a problem with n variables and m constraints can be reduced to a set cover problem with nm variables and m constraints. We define the mountain property on the coefficients in each column of the constraint matrix of a minimax problem and its generalization—the bitonic property. It is shown that these are equivalent, respectively, to set cover problems with consecutive 1's in each column or circular 1's in each column thus solving the periodic scheduling problem. We present a shortest path algorithm to solve a minimax problem with the mountain property in time O(mn + log m). For bitonic matrix problems, we present an algorithm of complexity O(m²(n + log n)). The same algorithms are used to solve a set cover problem on v sets and r elements to be covered in O(v + r log r) for a problem with consecutive 1's in each column and in O(v(v + r log r)) for a problem with circular 1's in each column. We further establish that r is at least and at most O(v). We also provide an efficient algorithm for recognizing bitonic matrices in O(mn log m) time. © 2002 Wiley Periodicals, Inc.
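The consecutive-1's case reduces to a shortest-path computation, which is one way to see why such instances are polynomially solvable: each column is an interval of rows, node i stands for "rows 1..i covered", and an interval [a, b] of cost c gives an edge i → b for every i ≥ a - 1. The sketch below is this generic Dijkstra-based reduction with hypothetical interval data, not the paper's O(v + r log r) algorithm:

```python
import heapq

def interval_set_cover(n_rows, intervals):
    """Min-cost cover of rows 1..n_rows by intervals (a, b, cost),
    each covering rows a..b, via Dijkstra on the coverage-prefix DAG.
    """
    INF = float("inf")
    dist = [INF] * (n_rows + 1)
    dist[0] = 0
    heap = [(0, 0)]
    while heap:
        d, i = heapq.heappop(heap)
        if d > dist[i]:          # stale heap entry
            continue
        for a, b, c in intervals:
            # Interval [a, b] extends coverage from rows 1..i iff it
            # starts no later than row i+1 and reaches past row i.
            if a - 1 <= i and b > i and d + c < dist[b]:
                dist[b] = d + c
                heapq.heappush(heap, (d + c, b))
    return dist[n_rows]

# Hypothetical instance: cover rows 1..4 at minimum total cost.
min_cover_cost = interval_set_cover(
    4, [(1, 2, 2), (2, 4, 3), (1, 1, 1), (3, 4, 1)]
)
```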

Book ChapterDOI
01 Jan 2002
TL;DR: An inductive learning method, IP2, is proposed to derive classification rules that correctly describe most of the examples belonging to a class and do not describe most of the examples not belonging to this class.
Abstract: In this paper we propose an inductive learning method, IP2, to derive classification rules that correctly describe most of the examples belonging to a class and do not describe most of the examples not belonging to this class. A pre-analysis of data is included that assigns higher weights to those values of attributes which occur more often in the positive than in the negative examples. The inductive learning problem is represented as a modification of the set covering problem, which is solved by an integer programming based algorithm using elements of a greedy algorithm or a genetic algorithm. The results are very encouraging and are illustrated on thyroid cancer and coronary heart disease problems.

DissertationDOI
01 Jan 2002
TL;DR: This thesis presents and analyzes the performance of a model-based approach to view planning for automated object reconstruction and presents a new theoretical framework for the view planning problem, which provides an explicit mathematical formulation amenable to a well understood set of exact and approximate solution techniques.
Abstract: There is a growing demand for 3D virtual models of complex physical objects in a wide range of applications. Thus far, the task of reconstructing high quality object models with range cameras (a shape measurement sensor) has been a manual, time consuming process. There is a growing demand to automate the model building process. The key open problem is to find an accurate, robust and efficient view planning algorithm—that is, the process of determining a suitable set of sensor viewpoints and imaging parameters for a specified reconstruction task with a given imaging environment. This thesis presents and analyzes the performance of a model-based approach to view planning for automated object reconstruction. The view planning approach is “performance-oriented” in the sense that it is based on a set of explicit quality requirements expressed in a model specification and incorporates performance models for the range camera and positioning system. A new theoretical framework for the view planning problem is presented as an instance of the set covering problem with a registration constraint and is formulated as an integer programming problem. The theoretical framework facilitates a more intuitive understanding of the nature of view planning. It also provides an explicit mathematical formulation amenable to a well understood set of exact and approximate solution techniques. The measurability impact of pose error is studied in detail and counter-measures are presented to mitigate its effects. Methods for sparse sampling of object surface space and viewpoint space are presented and their performance is characterized.

Dissertation
01 Jan 2002
TL;DR: In this article, two evolutionary algorithms, namely a Genetic Algorithm and a Simulated Evolution algorithm, attempting to model and solve the driver scheduling problem in new ways, are presented. The main objectives are to minimise the total number of shifts and the total shift cost.
Abstract: Bus and train driver scheduling is a process of partitioning blocks of work, each of which is serviced by one vehicle, into a set of legal driver shifts. The main objectives are to minimise the total number of shifts and the total shift cost. Restrictions imposed by logistic, legal and union agreements make the problem more complicated. The generate-and-select approach is widely used. A large set of feasible shifts is generated first, and then a subset is selected, from the large set, to form a final schedule by the mathematical programming method. In the subset selection phase, computational difficulties exist because of the NP-hard nature of this combinatorial optimisation problem. This thesis presents two evolutionary algorithms, namely a Genetic Algorithm and a Simulated Evolution algorithm, attempting to model and solve the driver scheduling problem in new ways. At the heart of both algorithms is a function for evaluating potential driver shifts under fuzzified criteria. A Genetic Algorithm is first employed to calibrate the weight distribution among fuzzy membership functions. A Simulated Evolution algorithm then mimics generations of evolution on the single schedule produced by the Genetic Algorithm. In each generation an unfit portion of the working schedule is removed. The broken schedule is then reconstructed by means of a greedy algorithm, using the weight distribution derived by the Genetic Algorithm. The basic Simulated Evolution algorithm is a greedy search strategy that achieves improvement through iterative perturbation and reconstruction. This approach has achieved success in solving driver scheduling problems from different companies, with comparable results to the previously best known solutions. Finally, the Simulated Evolution algorithm for driver scheduling has been generalized for the set covering problem, without using any special domain knowledge. 
This shows that this research is valuable to many applications that can be formulated as set covering models. Furthermore, Taguchi's orthogonal experimental design method has been used for the parameter settings. Computational results have shown that for large-scale problems, in general the proposed approach can produce superior solutions much faster than some existing approaches. This approach is particularly suitable for situations where quick and high-quality solutions are desirable.

Proceedings ArticleDOI
TL;DR: In this paper, the authors give a greedy algorithm for mixed packing and covering that finds an ε-approximate solution in O(ε^-2 log m) linear-time iterations.
Abstract: Mixed packing and covering problems are problems that can be formulated as linear programs using only non-negative coefficients. Examples include multicommodity network flow, the Held-Karp lower bound on TSP, fractional relaxations of set cover, bin-packing, knapsack, scheduling problems, minimum-weight triangulation, etc. This paper gives approximation algorithms for the general class of problems. The sequential algorithm is a simple greedy algorithm that can be implemented to find an ε-approximate solution in O(ε^-2 log m) linear-time iterations. The parallel algorithm does comparable work but finishes in polylogarithmic time. The results generalize previous work on pure packing and covering (the special case when the constraints are all "less-than" or all "greater-than") by Michael Luby and Noam Nisan (1993) and Naveen Garg and Jochen Konemann (1998).

Journal ArticleDOI
TL;DR: In this article, the set of alternative optimal solutions of the basic set covering problem with no other criterion taken into account except that of the whole species-list coverage is generated by the explicit exclusion method.
Abstract: Avifauna on Greek wetland sites is used as a model for the implementation of the Set Covering Problem in selecting nature reserves. Three site conservation values, which depend on species presence, are used as selection criteria. Their calculation is based upon species richness, species rarity and species-danger status. The conservation values must be inserted into the linear programming problem's objective function in the form of weighting factors. Optimal solutions according to the three ecological criteria are produced. These solutions belong to the set of alternative optimal solutions of the basic Set Covering Problem with no other criterion taken into account except that of the whole species-list coverage. The set of alternative optimal solutions is generated by the explicit exclusion method. The relative value of goal programming and criteria-weighting methods in producing a unique solution based on the three criteria simultaneously is assessed. Both methods coincide with the same alternative solution, which is thus regarded as the final optimal one incorporating all three ecological criteria.

Posted Content
18 May 2002
TL;DR: An algorithm based on the relatively simple new idea of randomly rounding variables to smaller-than-integer units is given to settle two longstanding open questions about approximation algorithms for solving a general covering integer program.
Abstract: In this paper we study approximation algorithms for solving a general covering integer program. An n-vector x of nonnegative integers is sought, which minimizes c^T·x, subject to Ax ≥ b, x ≤ d. The entries of A, b, c are nonnegative. Let m be the number of rows of A. Covering problems have been heavily studied in combinatorial optimization. We focus on the effect of the multiplicity constraints, x ≤ d, on approximability. Two longstanding open questions remain for this general formulation with upper bounds on the variables. (i) The integrality gap of the standard LP relaxation is arbitrarily large. Existing approximation algorithms that achieve the well-known O(log m)-approximation with respect to the LP value do so at the expense of violating the upper bounds on the variables by the same O(log m) multiplicative factor. What is the smallest possible violation of the upper bounds that still achieves cost within O(log m) of the standard LP optimum? (ii) The best known approximation ratio for the problem has been O(log(max_j Σ_i A_ij)) since 1982. This bound can be as bad as polynomial in the input size. Is an O(log m)-approximation, like the one known for the special case of Set Cover, possible? We settle these two open questions. To answer the first question we give an algorithm based on the relatively simple new idea of randomly rounding variables to smaller-than-integer units. To settle the second question we give a reduction from approximating the problem while respecting multiplicity constraints to approximating the problem with a bounded violation of the multiplicity constraints.

Dissertation
01 Jan 2002
TL;DR: This thesis describes a pre-processing stage for the TRACS-II scheduling system that selects potentially useful relief opportunities from the bus schedule by exploiting useful properties found in good driver schedules, specifically the chaining together of driver duties.
Abstract: The bus driver scheduling problem involves assigning a set of drivers to cover all available bus work such that every bus is assigned a driver, the number of duties is minimised and each duty conforms to the rules governing them regarding maximum driving time and so on. Generally this problem is solved using mathematical programming methods. The University of Leeds has developed a driver scheduling system, TRACS-II, that solves the bus driver scheduling problem by first generating a large set of potential duties and selecting a subset of these via the associated set covering ILP to form the schedule. The size of the set of potential duties used by TRACS-II is directly related to the number of relief opportunities present in the original bus schedule. Each relief opportunity potentially serves as a handover point between two bus shifts. A bus schedule containing many relief opportunities can have a very large set of potential shifts generated to cover the buswork. The complexity of the ILP is related to the number of relief opportunities present in the bus schedule. This thesis describes a pre-processing stage for the TRACS-II scheduling system. The pre-processor selects potentially useful relief opportunities from the bus schedule. The shifts generated by TRACS-II are restricted to the subset of relief opportunities made available to it by the pre-processor. The reduction of the number of relief opportunities serves to reduce the complexity of the resulting set covering problem by reducing the number of variables and constraints in the ILP. The pre-processor itself uses constraint programming to find several possible ways of selecting relief opportunities from the morning and evening portions of the bus schedule. This is done by exploiting useful properties found in good driver schedules, specifically the chaining together of driver duties such that when one driver takes a break another driver finishing a break continues on the same bus.
The pre-processor described has been shown to be effective on a wide variety of schedules in that the minimum number of drivers is almost always used and, in some cases, cheaper schedules can be produced.