Showing papers on "Set cover problem" published in 1990


Journal ArticleDOI
TL;DR: The set covering problem (SCP), as discussed by the authors, is the problem of covering the rows of an m-row, n-column zero-one matrix by a subset of the columns at minimum cost.
Abstract: The set covering problem (SCP) is the problem of covering the rows of an m-row, n-column zero-one matrix by a subset of the columns at minimum cost.
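
For reference, the SCP described in this abstract is usually stated as the 0-1 integer program below (standard textbook notation, not taken from the paper itself): a_ij is the given zero-one matrix, c_j the cost of column j, and x_j = 1 exactly when column j is chosen.

```latex
% Standard 0-1 integer programming statement of the SCP (textbook notation).
\begin{aligned}
\min\;  & \sum_{j=1}^{n} c_j x_j \\
\text{s.t.}\; & \sum_{j=1}^{n} a_{ij} x_j \ge 1, && i = 1,\dots,m, \\
        & x_j \in \{0,1\},                      && j = 1,\dots,n.
\end{aligned}
```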

277 citations


Journal ArticleDOI
TL;DR: An algorithm for a mixed set covering/partitioning model that includes as special cases the well-known set covering problem and set partitioning problem is presented.
Abstract: We present an algorithm for a mixed set covering/partitioning model that includes as special cases the well-known set covering problem and set partitioning problem. The novel feature of our algorit...
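
As a rough sketch of what such a mixed model looks like (notation assumed here, since the abstract is cut off): the rows are split into a covering set C and a partitioning set P, with greater-than-or-equal constraints on the former and equality constraints on the latter.

```latex
% Mixed covering/partitioning model; C and P are assumed names for the two row sets.
\begin{aligned}
\min\;  & \sum_{j=1}^{n} c_j x_j \\
\text{s.t.}\; & \sum_{j=1}^{n} a_{ij} x_j \ge 1, && i \in C \quad \text{(covering rows)}, \\
        & \sum_{j=1}^{n} a_{ij} x_j = 1,   && i \in P \quad \text{(partitioning rows)}, \\
        & x_j \in \{0,1\}.
\end{aligned}
```

With P empty this reduces to the set covering problem, and with C empty to the set partitioning problem.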

271 citations


Journal ArticleDOI
TL;DR: Two new models derived from the probabilistic location set covering problem allow the examination of the relationships between the number of facilities being located, the reliability that a vehicle will be available, and a coverage standard.
Abstract: In the last several years, the modeling of emergency vehicle location has focused on the temporal availability of the vehicles. Vehicles are not available for service when they are engaged in earlier calls. To incorporate this dynamic aspect into facility location decisions, models have been developed which provide additional levels of coverage. In this paper, two new models are derived from the probabilistic location set covering problem. These models allow the examination of the relationships between the number of facilities being located, the reliability that a vehicle will be available, and a coverage standard. In addition, these models incorporate sector-specific estimates of the availability of the vehicles. Solution of these models reveals that the use of sectoral estimates leads to facility locations that are distributed to a greater spatial extent over the region to be serviced.
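
One common way to write a probabilistic location set covering model of this kind is sketched below (assumed notation, not necessarily the exact formulation used in the paper): x_j = 1 if a vehicle is stationed at site j, N_i is the set of sites within the coverage standard of demand point i, q is the estimated busy fraction, and alpha is the required reliability that at least one vehicle is available.

```latex
% Sketch of a probabilistic location set covering model (assumed notation).
\begin{aligned}
\min\;  & \sum_{j} x_j \\
\text{s.t.}\; & \sum_{j \in N_i} x_j \ge b_i && \text{for every demand point } i, \\
        & x_j \in \{0,1\},
\end{aligned}
\qquad b_i = \min\{\, b \in \mathbb{N} : 1 - q^{\,b} \ge \alpha \,\}.
```

Sector-specific availability estimates replace the single busy fraction q with per-sector values q_i, which is what drives the more spatially dispersed facility locations reported in the paper.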

159 citations


Dissertation
01 Jan 1990
TL;DR: This thesis shows that for several large and important classes of computational problems, a weaker and less expensive form of randomness--namely, k-wise independent distributions--suffices, and it provides the best known deterministic NC approximation algorithms for set discrepancy, weighted discrepancy, lattice approximation, edge coloring, and other problems.
Abstract: During the past two decades, randomness has emerged as a central tool in the development of computational procedures. For example, there are many well-known problems for which randomness can be used to design an algorithm that works well with high probability on all inputs, whereas the best known deterministic algorithms fail miserably on worst-case inputs. Unfortunately, generating truly random bits can be very difficult and/or expensive. This thesis shows that for several large and important classes of computational problems, however, a weaker and less expensive form of randomness--namely, k-wise independent distributions--will suffice. As a consequence, we are able to devise improved algorithms and cryptographic protocols for a wide variety of problems. The most important applications of this work are listed below. (1) We obtain the first efficient parallel approximation algorithm for set cover. This linear-processor NC algorithm obtains a performance guarantee within a (1 + $\epsilon$) factor of the best sequential algorithm, while achieving a near-optimal speedup. This algorithm has applications in many areas, including parallel learning theory. (2) We provide a general technique for removing randomness from parallel algorithms that depend on up to polylogarithmic independence. The technique substantially generalizes the benefit function framework of Luby to include functions which are a sum of a polynomial number of arbitrary functions of $O$(log $n$) Boolean variables each. Special cases of polylogarithmic variable functions and multivalued random variables are also considered. As applications of these techniques, we provide the best known deterministic NC approximation algorithms for set discrepancy, weighted discrepancy, lattice approximation, edge coloring, and other problems. (3) We give the first construction of a secure signature scheme that is based solely on the existence of one-way functions (where by secure we mean secure against existential forgery under adaptive chosen message attack). This improves upon the best previously known construction, which requires the existence of a one-way permutation. Our construction is, in fact, optimal: the existence of one-way functions is both a necessary and sufficient condition for the existence of secure signature schemes. As part of constructing a signature scheme, we show how to construct a family of one-way hash functions given any one-way function. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)
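
The "best sequential algorithm" referred to in item (1) is the classical greedy heuristic for set cover; a short sequential sketch is shown below for orientation (the thesis's NC parallel version, which matches this guarantee to within a (1 + epsilon) factor, is not reproduced here).

```python
def greedy_set_cover(universe, subsets):
    """Classical greedy heuristic for (unweighted) set cover.

    universe: set of elements to cover.
    subsets:  dict mapping a subset name to the set of elements it covers.
    Returns a list of chosen subset names, or raises if no cover exists.
    Approximation guarantee: H_n = 1 + 1/2 + ... + 1/n times optimal.
    """
    uncovered = set(universe)
    cover = []
    while uncovered:
        # Pick the subset covering the most still-uncovered elements.
        best = max(subsets, key=lambda s: len(subsets[s] & uncovered))
        gained = subsets[best] & uncovered
        if not gained:
            raise ValueError("the given subsets do not cover the universe")
        cover.append(best)
        uncovered -= gained
    return cover


if __name__ == "__main__":
    U = {1, 2, 3, 4, 5}
    S = {"a": {1, 2, 3}, "b": {2, 4}, "c": {3, 4}, "d": {4, 5}}
    print(greedy_set_cover(U, S))  # ['a', 'd']
```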

19 citations


Proceedings ArticleDOI
08 Apr 1990
TL;DR: The purpose of this thesis is to explore the methods used to parallelize NP-complete problems and the degree of improvement that can be realized using a distributed parallel processor to solve these combinatoric problems.
Abstract: The purpose of this thesis is to explore the methods used to parallelize NP-complete problems and the degree of improvement that can be realized by using a distributed parallel processor to solve these combinatoric problems. Common NP-complete problem characteristics such as a priori reductions, use of partial-state information, and inhomogeneous searches are identified and studied. The set covering problem (SCP) is implemented for this research because many applications, such as information retrieval, task scheduling, and VLSI expression simplification, can be structured as an SCP. In addition, its generic NP-complete characteristics are well documented and a parallel implementation has not been reported. Parallel programming design techniques involve decomposing the problem and developing the parallel algorithms. The major components of a parallel solution are developed in a four-phase process. First, a meta-level design is produced using an appropriate design language such as UNITY. Then, the UNITY design is transformed into an algorithm and an implementation specific to a distributed architecture. Finally, a complexity analysis of the algorithm is performed. The a priori reductions are divide-and-conquer algorithms, whereas the search for the optimal set cover is accomplished with a branch-and-bound algorithm. The search utilizes a global best cost maintained at a central location for distribution to all processors. Three methods of load balancing are implemented and studied: coarse grain with static allocation of the search space, fine grain with dynamic allocation, and dynamic load balancing.
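
Below is a minimal sequential sketch of the branch-and-bound search for the optimal set cover described above; in the thesis the incumbent best cost is held at a central location and distributed to all processors, whereas here it is simply a shared variable, and the instance data are illustrative.

```python
def scp_branch_and_bound(costs, covers, n_rows):
    """Branch-and-bound for the set covering problem.

    costs[j]  : cost of column j.
    covers[j] : set of row indices covered by column j.
    n_rows    : rows 0..n_rows-1 must all be covered.
    Returns (best_cost, best_columns). The incumbent best cost plays the
    role of the globally shared bound in the parallel version.
    """
    best = {"cost": float("inf"), "cols": None}   # shared incumbent

    def branch(j, chosen, cost, covered):
        if cost >= best["cost"]:                  # bound: prune dominated partial solutions
            return
        if len(covered) == n_rows:                # all rows covered: new incumbent
            best["cost"], best["cols"] = cost, list(chosen)
            return
        if j == len(costs):                       # no more columns to try
            return
        # Branch 1: include column j (only useful if it covers something new).
        if covers[j] - covered:
            chosen.append(j)
            branch(j + 1, chosen, cost + costs[j], covered | covers[j])
            chosen.pop()
        # Branch 2: exclude column j.
        branch(j + 1, chosen, cost, covered)

    branch(0, [], 0, frozenset())
    return best["cost"], best["cols"]


if __name__ == "__main__":
    costs = [3, 2, 4, 2]
    covers = [{0, 1, 2}, {0, 3}, {1, 2, 3}, {2}]
    print(scp_branch_and_bound(costs, covers, 4))  # (5, [0, 1])
```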

4 citations


Journal ArticleDOI
01 Mar 1990
TL;DR: An efficient vectorized algorithm for the set cover problem, which is NP-complete, is presented and evaluated on an ETA10-Q108 supercomputer; it is concluded that solving the set cover problem with supercomputers is quite feasible.
Abstract: Supercomputers such as the CRAY-1, CRAY X-MP, CYBER 205, and ETA10 have regularly been used for solving numerical problems. It is very rare that supercomputers are used to solve combinatorial problems. In this paper, we present an efficient vectorized algorithm to solve the set cover problem, which was proved to be NP-complete, on a supercomputer, the ETA10-Q108. The algorithm fully utilizes vector instructions. Experiments are performed on both the ETA10-Q108 and a VAX/8550 for comparison. The VAX/8550 takes 1174.5 seconds to solve a set of problem instances, while the ETA10-Q108 takes only 26.6 seconds to solve the same set. For a problem instance involving 7000 elements in a set, the supercomputer takes 47.74 seconds; the VAX/8550 would need roughly 15 hours. We therefore conclude that it is quite feasible to solve the set cover problem using supercomputers.
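
The paper's ETA10 vector code is not reproduced here. As a rough illustration of why vectorization pays off, the sketch below packs each column's element set into a single integer bitmask, so a coverage test becomes a few word-wide bit operations instead of an element-by-element loop; this is a scalar stand-in for the element-parallel work that vector instructions perform in hardware.

```python
def exact_cover_cost_bitmask(costs, covers, n_elems):
    """Brute-force minimum-cost set cover with columns stored as bitmasks.

    costs[j]  : cost of column j.
    covers[j] : integer whose set bits are the elements column j covers.
    n_elems   : elements are bit positions 0..n_elems-1.
    Each union/coverage check is a single OR/compare on a packed word,
    the bit-parallel analogue of a vectorized inner loop.
    """
    full = (1 << n_elems) - 1                      # bitmask with all elements set
    n = len(costs)
    best_cost, best_pick = float("inf"), None
    for pick in range(1 << n):                     # enumerate all column subsets
        covered, cost = 0, 0
        for j in range(n):
            if pick >> j & 1:
                covered |= covers[j]               # whole-set union in one OR
                cost += costs[j]
        if covered == full and cost < best_cost:
            best_cost, best_pick = cost, pick
    return best_cost, [j for j in range(n) if best_pick >> j & 1]


if __name__ == "__main__":
    # covers[j] packs column j's elements into bits 0..3 (illustrative data).
    costs = [3, 2, 4, 2]
    covers = [0b0111, 0b1001, 0b1110, 0b0100]
    print(exact_cover_cost_bitmask(costs, covers, 4))  # (5, [0, 1])
```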

3 citations


Book ChapterDOI
01 Jan 1990
TL;DR: An assignment relaxation for the set covering problem (SCP) is introduced, a tree search method that makes use of this relaxation is developed, and computational experience on a collection of test problems is reported.
Abstract: An assignment relaxation for the set covering problem (SCP) is introduced and discussed. A tree search method is then developed which makes use of this relaxation. Computational experience with a collection of test problems is reported. The work reported here constitutes a part of a generalised tree search method for the solution of the SCP which is described in [11].

1 citations


Journal ArticleDOI
TL;DR: Two set-covering algorithms are proposed and proved correct: one selects and orders the covering candidates by degree, and the other uses lexicographic ordering.
Abstract: Two set-covering algorithms are proposed and proved correct: one selects and orders the covering candidates by degree, and the other uses lexicographic ordering.
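
A hedged sketch of the two selection strategies named in the abstract is given below (the paper's actual algorithms and correctness proofs are not shown): the same covering loop is driven either by a by-degree ordering or by a lexicographic ordering of the candidates.

```python
def cover_with_ordering(universe, candidates, key):
    """Build a cover by repeatedly taking the candidate ranked first by `key`.

    candidates: dict mapping a candidate name to the set of elements it covers.
    key(name, uncovered) returns a sort key; the smallest key wins.
    This only illustrates the two ordering strategies; it is not the
    paper's algorithm.
    """
    uncovered, chosen = set(universe), []
    while uncovered:
        useful = [c for c in candidates if candidates[c] & uncovered]
        if not useful:
            raise ValueError("no cover exists")
        pick = min(useful, key=lambda c: key(c, uncovered))
        chosen.append(pick)
        uncovered -= candidates[pick]
    return chosen


def by_degree(cands):
    """Prefer the candidate covering the most still-uncovered elements."""
    return lambda c, unc: -len(cands[c] & unc)


def lexicographic(cands):
    """Prefer the candidate whose name sorts first (ignores coverage)."""
    return lambda c, unc: c


if __name__ == "__main__":
    U = {1, 2, 3, 4}
    S = {"a": {1, 2}, "b": {2, 3, 4}, "c": {1, 4}}
    print(cover_with_ordering(U, S, by_degree(S)))      # ['b', 'a']
    print(cover_with_ordering(U, S, lexicographic(S)))  # ['a', 'b']
```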