Author

William J. Cook

Bio: William J. Cook is an academic researcher from the Georgia Institute of Technology. The author has contributed to research on topics including the travelling salesman problem and combinatorial optimization. The author has an h-index of 34 and has co-authored 86 publications receiving 10,380 citations. Previous affiliations of William J. Cook include the University of Pittsburgh and Rutgers University.


Papers
Book
04 Feb 2007
TL;DR: The book describes the methods and computer code used to solve large-scale instances of the traveling salesman problem, demonstrating the interplay of applied mathematics with increasingly powerful computing platforms.
Abstract: This book presents the latest findings on one of the most intensely investigated subjects in computational mathematics--the traveling salesman problem. It sounds simple enough: given a set of cities and the cost of travel between each pair of them, the problem challenges you to find the cheapest route by which to visit all the cities and return home to where you began. Though seemingly modest, this exercise has inspired studies by mathematicians, chemists, and physicists. Teachers use it in the classroom. It has practical applications in genetics, telecommunications, and neuroscience. The authors of this book are the same pioneers who for nearly two decades have led the investigation into the traveling salesman problem. They have derived solutions to almost eighty-six thousand cities, yet a general solution to the problem has yet to be discovered. Here they describe the method and computer code they used to solve a broad range of large-scale problems, and along the way they demonstrate the interplay of applied mathematics with increasingly powerful computing platforms. They also give the fascinating history of the problem--how it developed, and why it continues to intrigue us.
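The problem statement above can be made concrete in a few lines of code. The sketch below is a naive brute-force solver over a hypothetical 4-city cost matrix, included only to illustrate the problem definition; it is nothing like the Concorde code described in the book, which relies on cutting planes and branch-and-cut to reach instances with tens of thousands of cities.

```python
from itertools import permutations

def tsp_brute_force(cost):
    """Exhaustively search all tours starting and ending at city 0.

    `cost` is a symmetric matrix: cost[i][j] is the travel cost between
    cities i and j. Feasible only for a handful of cities; it illustrates
    the problem statement, not the book's large-scale methods.
    """
    n = len(cost)
    best_tour, best_cost = None, float("inf")
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        total = sum(cost[a][b] for a, b in zip(tour, tour[1:]))
        if total < best_cost:
            best_tour, best_cost = tour, total
    return best_tour, best_cost

# Toy 4-city instance (illustrative numbers only).
cost = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
print(tsp_brute_force(cost))  # ((0, 1, 3, 2, 0), 18)
```

Exhaustive search scales as (n-1)!, which is precisely why the large-scale techniques the book describes are needed for instances with tens of thousands of cities.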

1,737 citations

Journal ArticleDOI
TL;DR: The optimization procedure, combining the heuristic method and the combinatorial branch and bound algorithm, solved the well-known 10×10 problem of J. F. Muth and G. L. Thompson in under 7 minutes of computation time on a Sun Sparcstation 1.
Abstract: The job-shop scheduling problem is a notoriously difficult problem in combinatorial optimization. Although even modest-sized instances remain computationally intractable, a number of important algorithmic advances have been made in recent years by J. Adams, E. Balas and D. Zawack; J. Carlier and E. Pinson; B. J. Lageweg, J. K. Lenstra and A. H. G. Rinnooy Kan; and others. Making use of a number of these advances, we have designed and implemented a new heuristic procedure for finding schedules, a cutting-plane method for obtaining lower bounds, and a combinatorial branch and bound algorithm. Our optimization procedure, combining the heuristic method and the combinatorial branch and bound algorithm, solved the well-known 10×10 problem of J. F. Muth and G. L. Thompson in under 7 minutes of computation time on a Sun Sparcstation 1.
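As a point of reference for the problem being solved, the sketch below is a simple greedy dispatching rule over a hypothetical 3-job, 3-machine instance: each job is an ordered list of (machine, duration) operations, and the next ready operation that can start earliest is scheduled. This is only an illustration of the problem structure, not the heuristic, cutting-plane, or branch-and-bound procedures from the paper, and such rules generally do not find optimal schedules.

```python
def greedy_schedule(jobs, num_machines):
    """Schedule jobs by always dispatching the ready operation that can
    start earliest (a simple dispatching rule, for illustration only).

    `jobs` is a list of jobs; each job is a list of (machine, duration)
    operations that must run in order. Returns the makespan.
    """
    machine_free = [0] * num_machines   # when each machine is next idle
    job_free = [0] * len(jobs)          # when each job's last operation finished
    next_op = [0] * len(jobs)           # index of each job's next operation

    remaining = sum(len(job) for job in jobs)
    while remaining:
        # Pick the ready operation with the earliest possible start time.
        best = None
        for j, job in enumerate(jobs):
            if next_op[j] == len(job):
                continue
            machine, dur = job[next_op[j]]
            start = max(job_free[j], machine_free[machine])
            if best is None or start < best[0]:
                best = (start, j, machine, dur)
        start, j, machine, dur = best
        machine_free[machine] = job_free[j] = start + dur
        next_op[j] += 1
        remaining -= 1

    return max(job_free)

# Toy 3-job, 3-machine instance (illustrative durations only).
jobs = [
    [(0, 3), (1, 2), (2, 2)],
    [(0, 2), (2, 1), (1, 4)],
    [(1, 4), (2, 3)],
]
print(greedy_schedule(jobs, 3))  # 14
```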

849 citations

Book
15 Jan 2007
TL;DR: The authors are the same pioneers who for nearly two decades have led the investigation into the traveling salesman problem, and here they describe the method and computer code they used to solve a broad range of large-scale problems, and along the way they demonstrate the interplay of applied mathematics with increasingly powerful computing platforms.
Abstract: This book presents the latest findings on one of the most intensely investigated subjects in computational mathematics--the traveling salesman problem. It sounds simple enough: given a set of cities and the cost of travel between each pair of them, the problem challenges you to find the cheapest route by which to visit all the cities and return home to where you began. Though seemingly modest, this exercise has inspired studies by mathematicians, chemists, and physicists. Teachers use it in the classroom. It has practical applications in genetics, telecommunications, and neuroscience.The authors of this book are the same pioneers who for nearly two decades have led the investigation into the traveling salesman problem. They have derived solutions to almost eighty-six thousand cities, yet a general solution to the problem has yet to be discovered. Here they describe the method and computer code they used to solve a broad range of large-scale problems, and along the way they demonstrate the interplay of applied mathematics with increasingly powerful computing platforms. They also give the fascinating history of the problem--how it developed, and why it continues to intrigue us.

517 citations

Journal ArticleDOI
TL;DR: The authors present a concise yet comprehensive overview of the state of the art in the relevant fields of research, summarize important open problems, and lay out a roadmap for future progress.
Abstract: The grand challenges of contemporary fundamental physics---dark matter, dark energy, vacuum energy, inflation and early universe cosmology, singularities and the hierarchy problem---all involve gravity as a key component. And of all gravitational phenomena, black holes stand out in their elegant simplicity, while harbouring some of the most remarkable predictions of General Relativity: event horizons, singularities and ergoregions. The hitherto invisible landscape of the gravitational Universe is being unveiled before our eyes: the historical direct detection of gravitational waves by the LIGO-Virgo collaboration marks the dawn of a new era of scientific exploration. Gravitational-wave astronomy will allow us to test models of black hole formation, growth and evolution, as well as models of gravitational-wave generation and propagation. It will provide evidence for event horizons and ergoregions, test the theory of General Relativity itself, and may reveal the existence of new fundamental fields. The synthesis of these results has the potential to radically reshape our understanding of the cosmos and of the laws of Nature. The purpose of this work is to present a concise, yet comprehensive overview of the state of the art in the relevant fields of research, summarize important open problems, and lay out a roadmap for future progress.

407 citations


Cited by
Journal ArticleDOI
TL;DR: This work presents a new coarsening heuristic (called the heavy-edge heuristic) for which the size of the partition of the coarse graph is within a small factor of the size of the final partition obtained after multilevel refinement, and a much faster variation of the Kernighan-Lin (KL) algorithm for refinement during uncoarsening.
Abstract: Recently, a number of researchers have investigated a class of graph partitioning algorithms that reduce the size of the graph by collapsing vertices and edges, partition the smaller graph, and then uncoarsen it to construct a partition for the original graph [Bui and Jones, Proc. of the 6th SIAM Conference on Parallel Processing for Scientific Computing, 1993, 445--452; Hendrickson and Leland, A Multilevel Algorithm for Partitioning Graphs, Tech. report SAND 93-1301, Sandia National Laboratories, Albuquerque, NM, 1993]. From the early work it was clear that multilevel techniques held great promise; however, it was not known if they can be made to consistently produce high quality partitions for graphs arising in a wide range of application domains. We investigate the effectiveness of many different choices for all three phases: coarsening, partition of the coarsest graph, and refinement. In particular, we present a new coarsening heuristic (called heavy-edge heuristic) for which the size of the partition of the coarse graph is within a small factor of the size of the final partition obtained after multilevel refinement. We also present a much faster variation of the Kernighan--Lin (KL) algorithm for refining during uncoarsening. We test our scheme on a large number of graphs arising in various domains including finite element methods, linear programming, VLSI, and transportation. Our experiments show that our scheme produces partitions that are consistently better than those produced by spectral partitioning schemes in substantially smaller time. Also, when our scheme is used to compute fill-reducing orderings for sparse matrices, it produces orderings that have substantially smaller fill than the widely used multiple minimum degree algorithm.
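To make the coarsening idea concrete, the sketch below shows a heavy-edge style matching pass on a hypothetical weighted graph: each unmatched vertex is paired with the unmatched neighbor joined by the heaviest edge, and the matched pairs would then be collapsed to form the coarser graph. It is an illustrative reading of the heuristic described in the abstract, not the METIS implementation.

```python
def heavy_edge_matching(adj):
    """One coarsening pass in the spirit of the heavy-edge heuristic.

    `adj` maps each vertex to a dict {neighbor: edge_weight}. Each
    unmatched vertex is matched with the unmatched neighbor joined by
    the heaviest edge; matched pairs would then be collapsed into
    single vertices of the coarser graph. Illustrative sketch only.
    """
    matched = {}
    for v in adj:
        if v in matched:
            continue
        # Heaviest edge to a still-unmatched neighbor, if any.
        candidates = [(w, u) for u, w in adj[v].items() if u not in matched]
        if candidates:
            _, u = max(candidates)
            matched[v], matched[u] = u, v
        else:
            matched[v] = v  # no unmatched neighbor; carried over as-is
    return matched

# Toy weighted graph (illustrative weights only).
adj = {
    "a": {"b": 5, "c": 1},
    "b": {"a": 5, "d": 2},
    "c": {"a": 1, "d": 3},
    "d": {"b": 2, "c": 3},
}
print(heavy_edge_matching(adj))  # {'a': 'b', 'b': 'a', 'c': 'd', 'd': 'c'}
```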

5,629 citations

Journal ArticleDOI
TL;DR: This paper compares the running times of several standard min-cut/max-flow algorithms, as well as a new algorithm recently developed by the authors that works several times faster than any of the other methods, making near real-time performance possible.
Abstract: Minimum cut/maximum flow algorithms on graphs have emerged as an increasingly useful tool for exact or approximate energy minimization in low-level vision. The combinatorial optimization literature provides many min-cut/max-flow algorithms with different polynomial time complexity. Their practical efficiency, however, has to date been studied mainly outside the scope of computer vision. The goal of this paper is to provide an experimental comparison of the efficiency of min-cut/max-flow algorithms for applications in vision. We compare the running times of several standard algorithms, as well as a new algorithm that we have recently developed. The algorithms we study include both Goldberg-Tarjan style "push-relabel" methods and algorithms based on Ford-Fulkerson style "augmenting paths." We benchmark these algorithms on a number of typical graphs in the contexts of image restoration, stereo, and segmentation. In many cases, our new algorithm works several times faster than any of the other methods, making near real-time performance possible. An implementation of our max-flow/min-cut algorithm is available upon request for research purposes.
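For context, an augmenting-path method of the Ford-Fulkerson style mentioned above can be sketched in a few lines. The code below is a textbook Edmonds-Karp variant (BFS for shortest augmenting paths) on a hypothetical toy network; it illustrates the family of algorithms being benchmarked, not the new algorithm introduced in the paper.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp: repeatedly push flow along a shortest augmenting
    path found by BFS. `capacity` is a dict of dicts of edge capacities.
    A textbook augmenting-path method, shown for illustration only.
    """
    # Residual capacities, including reverse edges initialised to 0.
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)

    flow = 0
    while True:
        # BFS for an augmenting path with positive residual capacity.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # Bottleneck capacity along the path, then augment.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

# Toy network (illustrative capacities only).
capacity = {"s": {"a": 3, "b": 2}, "a": {"b": 1, "t": 2}, "b": {"t": 3}, "t": {}}
print(max_flow(capacity, "s", "t"))  # 5
```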

4,463 citations

Book ChapterDOI
03 Sep 2001
TL;DR: The goal of this paper is to provide an experimental comparison of the efficiency of min-cut/max-flow algorithms for applications in vision, comparing the running times of several standard algorithms as well as a new algorithm recently developed by the authors.
Abstract: After [10, 15, 12, 2, 4] minimum cut/maximum flow algorithms on graphs emerged as an increasingly useful tool for exact or approximate energy minimization in low-level vision. The combinatorial optimization literature provides many min-cut/max-flow algorithms with different polynomial time complexity. Their practical efficiency, however, has to date been studied mainly outside the scope of computer vision. The goal of this paper is to provide an experimental comparison of the efficiency of min-cut/max flow algorithms for energy minimization in vision. We compare the running times of several standard algorithms, as well as a new algorithm that we have recently developed. The algorithms we study include both Goldberg-style "push-relabel" methods and algorithms based on Ford-Fulkerson style augmenting paths. We benchmark these algorithms on a number of typical graphs in the contexts of image restoration, stereo, and interactive segmentation. In many cases our new algorithm works several times faster than any of the other methods making near real-time performance possible.

3,099 citations

Proceedings ArticleDOI
22 Jun 2001
TL;DR: This paper describes the development of a new complete solver, Chaff, which achieves significant performance gains through careful engineering of all aspects of the search, especially a particularly efficient implementation of Boolean constraint propagation (BCP) and a novel low-overhead decision strategy.
Abstract: Boolean satisfiability is probably the most studied of the combinatorial optimization/search problems. Significant effort has been devoted to trying to provide practical solutions to this problem for problem instances encountered in a range of applications in electronic design automation (EDA), as well as in artificial intelligence (AI). This study has culminated in the development of several SAT packages, both proprietary and in the public domain (e.g. GRASP, SATO) which find significant use in both research and industry. Most existing complete solvers are variants of the Davis-Putnam (DP) search algorithm. In this paper we describe the development of a new complete solver, Chaff, which achieves significant performance gains through careful engineering of all aspects of the search, especially a particularly efficient implementation of Boolean constraint propagation (BCP) and a novel low-overhead decision strategy. Chaff has been able to obtain one to two orders of magnitude performance improvement on difficult SAT benchmarks in comparison with other solvers (DP or otherwise), including GRASP and SATO.
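Boolean constraint propagation itself is easy to state: whenever a clause has all but one literal falsified, the remaining literal is forced. The sketch below is a naive unit-propagation loop over DIMACS-style clauses, included only to illustrate what BCP computes; Chaff's contribution is an efficient two-watched-literal implementation of this step, which the sketch does not attempt to reproduce.

```python
def unit_propagate(clauses, assignment):
    """Naive Boolean constraint propagation over a CNF formula.

    `clauses` is a list of clauses, each a list of nonzero ints
    (DIMACS-style literals: v means variable v is true, -v means false).
    `assignment` maps variables to booleans. Repeatedly assigns the
    unassigned literal of any unit clause; returns the extended
    assignment, or None on a conflict. A naive scan, not Chaff's
    two-watched-literal scheme.
    """
    assignment = dict(assignment)
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            unassigned = []
            satisfied = False
            for lit in clause:
                var, want = abs(lit), lit > 0
                if var not in assignment:
                    unassigned.append(lit)
                elif assignment[var] == want:
                    satisfied = True
                    break
            if satisfied:
                continue
            if not unassigned:
                return None                  # conflict: clause falsified
            if len(unassigned) == 1:         # unit clause: forced assignment
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True
    return assignment

# (x1 or x2) and (not x1 or x3) and (not x2) -- illustrative formula.
clauses = [[1, 2], [-1, 3], [-2]]
print(unit_propagate(clauses, {}))  # {2: False, 1: True, 3: True}
```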

2,886 citations