Journal ArticleDOI

A fast algorithm for the maximum clique problem

15 Aug 2002-Discrete Applied Mathematics (Elsevier Science Publishers B. V.)-Vol. 120, Iss: 1, pp 197-207
TL;DR: A branch-and-bound algorithm for the maximum clique problem--which is computationally equivalent to the maximum independent (stable) set problem--is presented with the vertex order taken from a coloring of the vertices and with a new pruning strategy.
About: This article is published in Discrete Applied Mathematics. The article was published on 2002-08-15 and is currently open access. It has received 645 citations to date. The article focuses on the topics: Independent set & Chordal graph.
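The TL;DR above describes a branch-and-bound search that examines vertices in a fixed order and drives its new pruning rule with a table of suffix results. A minimal sketch of that style of search, as a hypothetical Python illustration (not the paper's implementation; the paper additionally derives the vertex order from a coloring):

```python
# Hypothetical sketch of an Ostergard-style branch-and-bound clique search.
# best[i] records the size of the largest clique found inside the suffix
# order[i:]; when extending a clique with candidate j, the bound
# size + best[j] <= record lets the branch be pruned.

def max_clique(adj, order):
    """Return the maximum clique size. adj maps a vertex to its neighbour set."""
    n = len(order)
    pos = {v: i for i, v in enumerate(order)}
    best = [0] * n
    record = 0

    def expand(cands, size):
        nonlocal record
        while cands:
            j = min(cands)
            if size + (n - j) <= record:      # too few vertices remain
                return
            if size + best[j] <= record:      # suffix bound: the key prune
                return
            cands.remove(j)
            new = {k for k in cands if order[k] in adj[order[j]]}
            if new:
                expand(new, size + 1)
            else:
                record = max(record, size + 1)

    for i in range(n - 1, -1, -1):            # grow suffixes back to front
        cands = {pos[u] for u in adj[order[i]] if pos[u] > i}
        if cands:
            expand(cands, 1)
        else:
            record = max(record, 1)
        best[i] = record
    return record
```

Processing suffixes from the back means `best[j]` is always available when vertex `j` appears as a candidate, which is what makes the prune sound.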
Citations
Journal ArticleDOI
19 Jun 2014-Nature
TL;DR: In this paper, a connection is established between the onset of quantum contextuality and the possibility of universal quantum computation via "magic state" distillation.
Abstract: Quantum computers promise dramatic advantages over their classical counterparts, but the source of the power in quantum computing has remained elusive. Here we prove a remarkable equivalence between the onset of contextuality and the possibility of universal quantum computation via ‘magic state’ distillation, which is the leading model for experimentally realizing a fault-tolerant quantum computer. This is a conceptually satisfying link, because contextuality, which precludes a simple ‘hidden variable’ model of quantum mechanics, provides one of the fundamental characterizations of uniquely quantum phenomena. Furthermore, this connection suggests a unifying paradigm for the resources of quantum information: the non-locality of quantum theory is a particular kind of contextuality, and non-locality is already known to be a critical resource for achieving advantages with quantum communication. In addition to clarifying these fundamental issues, this work advances the resource framework for quantum computation, which has a number of practical applications, such as characterizing the efficiency and trade-offs between distinct theoretical and experimental schemes for achieving robust quantum computation, and putting bounds on the overhead cost for the classical simulation of quantum algorithms.

Editor's summary: Quantum computing promises advantages over classical computing for certain problems; now ‘quantum contextuality’ — a generalization of the concept of quantum non-locality — is shown to be a critical resource that gives the most promising class of quantum computers their power. But what are the specific features of quantum mechanics that are ultimately responsible for this enhanced potential? Mark Howard and colleagues identify ‘quantum contextuality’ as the critical resource that gives quantum computers their power. This finding not only clarifies the theoretical basis of quantum computing, it also provides a framework for directing experimental efforts to most effectively harness the weirdness of quantum mechanics for computational tasks.

524 citations

31 Dec 1994
TL;DR: A partially enumerative algorithm for the maximum clique problem is presented which is very simple to implement; computational results for an efficient implementation on an IBM 3090 computer are provided.
Abstract: We present an exact partial enumerative algorithm for the maximum clique problem. The pruning device used is derived from graph colorings. Pruning of the search tree is accomplished not only by the number of colors used to color a tree subproblem but also by using information gained in the process of coloring. This leads to increased pruning which translates into improved computational performance. Experimental results on test problems are presented.

467 citations
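The coloring-based pruning described in the abstract above rests on a simple fact: the vertices of a clique must all receive distinct colors, so the number of colors a greedy coloring uses on a candidate set upper-bounds the size of any clique inside it. A hedged Python illustration of that bound (names are mine, not the paper's):

```python
# Illustrative sketch (not the paper's code): a greedy coloring of a candidate
# set yields an upper bound on any clique it contains, because each clique
# vertex needs its own color. A subproblem can be pruned whenever
# current_size + number_of_colors <= best_found.

def greedy_coloring_bound(vertices, adj):
    """Return the number of colors a greedy coloring uses on `vertices`.

    adj: dict vertex -> set of neighbours. The count upper-bounds the size
    of any clique contained in `vertices`.
    """
    color = {}
    num_colors = 0
    for v in vertices:
        used = {color[u] for u in adj[v] if u in color}
        c = next(c for c in range(num_colors + 1) if c not in used)
        color[v] = c
        num_colors = max(num_colors, c + 1)
    return num_colors
```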

Journal ArticleDOI
TL;DR: These algorithms use three different methods for determining the number of edge-disjoint embeddings of a subgraph and employ novel algorithms for candidate generation and frequency counting, which allow them to operate on datasets with different characteristics and to quickly prune unpromising subgraphs.
Abstract: Graph-based modeling has emerged as a powerful abstraction capable of capturing in a single and unified framework many of the relational, spatial, topological, and other characteristics that are present in a variety of datasets and application areas. Computationally efficient algorithms that find patterns corresponding to frequently occurring subgraphs play an important role in developing data mining-driven methodologies for analyzing the graphs resulting from such datasets. This paper presents two algorithms, based on the horizontal and vertical pattern discovery paradigms, that find the connected subgraphs that have a sufficient number of edge-disjoint embeddings in a single large undirected labeled sparse graph. These algorithms use three different methods for determining the number of edge-disjoint embeddings of a subgraph and employ novel algorithms for candidate generation and frequency counting, which allow them to operate on datasets with different characteristics and to quickly prune unpromising subgraphs. Experimental evaluation on real datasets from various domains shows that both algorithms achieve good performance, scale well to sparse input graphs with more than 120,000 vertices or 110,000 edges, and significantly outperform previously developed algorithms.

444 citations


Cites methods from "A fast algorithm for the maximum clique problem"

  • ...In this study, we used a fast implementation of the exact maximum clique (MC) problem solver wclique (Östergård, 2002) instead of those fast exact MIS algorithms....


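The quoted passage concerns the paper's frequency measure: the frequency of a subgraph is the largest collection of pairwise edge-disjoint embeddings, which is a maximum independent set in the overlap graph of embeddings (equivalently, a maximum clique in its complement); hence a clique solver such as wclique can be used. A brute-force Python sketch of the definition itself, for illustration only:

```python
# Illustrative brute force (exponential; fine only for tiny inputs): find the
# largest set of pairwise edge-disjoint embeddings, where each embedding is
# represented as a frozenset of the edges it occupies.

from itertools import combinations

def edge_disjoint_frequency(embeddings):
    """embeddings: list of frozensets of edges. Returns the max number of
    pairwise edge-disjoint embeddings (the paper's frequency notion)."""
    for r in range(len(embeddings), 0, -1):   # try larger collections first
        for combo in combinations(embeddings, r):
            if all(a.isdisjoint(b) for a, b in combinations(combo, 2)):
                return r
    return 0
```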

Journal ArticleDOI
TL;DR: The proposed MCES algorithm is based on a maximum clique formulation of the problem and is a significant improvement over other published algorithms and presents new approaches to both lower and upper bounding as well as vertex selection.
Abstract: A new graph similarity calculation procedure is introduced for comparing labeled graphs. Given a minimum similarity threshold, the procedure consists of an initial screening process to determine whether it is possible for the measure of similarity between the two graphs to exceed the minimum threshold, followed by a rigorous maximum common edge subgraph (MCES) detection algorithm to compute the exact degree and composition of similarity. The proposed MCES algorithm is based on a maximum clique formulation of the problem and is a significant improvement over other published algorithms. It presents new approaches to both lower and upper bounding as well as vertex selection.

327 citations


Cites methods from "A fast algorithm for the maximum clique problem"

  • ...The unique branching procedure of the Ostergard algorithm precludes the use of a lower-bounding procedure for the size of the maximum clique....


  • ...To further ensure uniformity, all four algorithms were run in conjunction with the costvector screening procedure previously presented as well as the minimum similarity index (MSI) lower bound for the size of the maximum clique, with the exception of the Ostergard algorithm which was run using only the costvector screening....


  • ...These are the MC1 algorithm of Wood [31] employing a greedy coloring upper bound and the Ostergard [39] algorithm....


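The clique formulation the MCES abstract refers to follows a standard reduction. In the simpler vertex-based variant (maximum common induced subgraph), one builds the modular product of the two graphs: its nodes are vertex pairs, two pairs are joined when they induce a consistent correspondence, and maximum cliques in the product correspond to maximum common subgraphs. An illustrative Python sketch under those assumptions (the paper itself works with the edge-subgraph analogue):

```python
# Illustrative sketch of the modular product used in the classic reduction of
# maximum common induced subgraph to maximum clique. Nodes are pairs (u, v);
# (u, v) and (x, y) are adjacent when u != x, v != y, and u~x in G1 exactly
# when v~y in G2 (the correspondence preserves both edges and non-edges).

def modular_product(adj1, adj2):
    """adj1, adj2: dict vertex -> set of neighbours. Returns the product as
    the same dict-of-sets representation."""
    nodes = [(u, v) for u in adj1 for v in adj2]
    prod = {p: set() for p in nodes}
    for (u, v) in nodes:
        for (x, y) in nodes:
            if u != x and v != y and ((x in adj1[u]) == (y in adj2[v])):
                prod[(u, v)].add((x, y))
    return prod
```

A clique of size k in the product picks k vertices of each graph together with a correspondence between them, which is why any maximum clique solver (including the one this article presents) can be plugged in.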

Journal ArticleDOI
TL;DR: This paper presents a new exact algorithm for the Capacitated Vehicle Routing Problem (CVRP) based on the set partitioning formulation with additional cuts that correspond to capacity and clique inequalities by combining three dual ascent heuristics.
Abstract: This paper presents a new exact algorithm for the Capacitated Vehicle Routing Problem (CVRP) based on the set partitioning formulation with additional cuts that correspond to capacity and clique inequalities. The exact algorithm uses a bounding procedure that finds a near optimal dual solution of the LP-relaxation of the resulting mathematical formulation by combining three dual ascent heuristics. The first dual heuristic is based on the q-route relaxation of the set partitioning formulation of the CVRP. The second one combines Lagrangean relaxation, pricing and cut generation. The third attempts to close the duality gap left by the first two procedures using a classical pricing and cut generation technique. The final dual solution is used to generate a reduced problem containing only the routes whose reduced costs are smaller than the gap between an upper bound and the lower bound achieved. The resulting problem is solved by an integer programming solver. Computational results over the main instances from the literature show the effectiveness of the proposed algorithm.

313 citations


Cites methods from "A fast algorithm for the maximum clique problem"

  • ...The CLIQUER package is composed of a set of C routines for finding cliques in an arbitrary weighted graph based on the exact branch-and-bound algorithm developed by Östergård [40]....

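The set-partitioning formulation the CVRP abstract builds on selects a minimum-cost subset of feasible routes that visits every customer exactly once. A toy brute-force Python sketch of that formulation (route data invented for illustration; the paper solves the LP relaxation with dual ascent heuristics rather than enumerating subsets):

```python
# Toy sketch of the CVRP set-partitioning model: each column is a feasible
# route (a frozenset of customers) with a cost; choose routes so that every
# customer is covered exactly once at minimum total cost. Exponential brute
# force, for illustration only.

from itertools import combinations

def set_partition_cvrp(routes, costs, customers):
    """routes: list of frozensets of customers; costs: parallel cost list.
    Returns the minimum cost of an exact partition, or None if none exists."""
    best = None
    idx = range(len(routes))
    for r in range(1, len(routes) + 1):
        for combo in combinations(idx, r):
            covered = [c for i in combo for c in routes[i]]
            # exact cover: right total count and right set => each exactly once
            if len(covered) == len(customers) and set(covered) == set(customers):
                cost = sum(costs[i] for i in combo)
                if best is None or cost < best:
                    best = cost
    return best
```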

References
Book
01 Jan 1979
TL;DR: This quarterly column provides a continuing update to the list of problems (NP-complete and harder) presented by M. R. Garey and D. S. Johnson in their book "Computers and Intractability: A Guide to the Theory of NP-Completeness", W. H. Freeman & Co., San Francisco, 1979.
Abstract: This is the second edition of a quarterly column the purpose of which is to provide a continuing update to the list of problems (NP-complete and harder) presented by M. R. Garey and myself in our book ‘‘Computers and Intractability: A Guide to the Theory of NP-Completeness,’’ W. H. Freeman & Co., San Francisco, 1979 (hereinafter referred to as ‘‘[G&J]’’; previous columns will be referred to by their dates). A background equivalent to that provided by [G&J] is assumed. Readers having results they would like mentioned (NP-hardness, PSPACE-hardness, polynomial-time-solvability, etc.), or open problems they would like publicized, should send them to David S. Johnson, Room 2C355, Bell Laboratories, Murray Hill, NJ 07974, including details, or at least sketches, of any new proofs (full papers are preferred). In the case of unpublished results, please state explicitly that you would like the results mentioned in the column. Comments and corrections are also welcome. For more details on the nature of the column and the form of desired submissions, see the December 1981 issue of this journal.

40,020 citations

Book ChapterDOI
TL;DR: A survey of results concerning algorithms, complexity, and applications of the maximum clique problem is presented and enumerative and exact algorithms, heuristics, and a variety of other proposed methods are discussed.
Abstract: The maximum clique problem is a classical problem in combinatorial optimization which finds important applications in different domains. In this paper we try to give a survey of results concerning algorithms, complexity, and applications of this problem, and also provide an updated bibliography. Of course, we build upon precursory works with similar goals [39, 232, 266].

1,065 citations

Journal ArticleDOI
TL;DR: In this paper, a partially enumerative algorithm for the maximum clique problem is presented which is very simple to implement; computational results for an efficient implementation on an IBM 3090 computer are reported for randomly generated graphs with up to 3000 vertices and over one million edges.

476 citations