Author

George S. Lueker

Other affiliations: Alcatel-Lucent
Bio: George S. Lueker is an academic researcher at the University of California, Irvine. He has contributed to research on the bin packing problem and the probabilistic analysis of algorithms, has an h-index of 18, and has co-authored 38 publications receiving 3,382 citations. His previous affiliations include Alcatel-Lucent.

Papers
Journal ArticleDOI
TL;DR: The test for the consecutive ones property in matrices and for graph planarity is extended to a test for interval graphs, using a recently discovered fast recognition algorithm for chordal graphs.
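The consecutive ones property asks whether the columns of a 0/1 matrix can be ordered so that the 1s in every row are contiguous. The paper decides this in linear time with PQ-trees; the brute-force sketch below is only an illustration of the property itself, not the paper's algorithm.

```python
from itertools import permutations

def has_consecutive_ones_property(matrix):
    """Return True if some column ordering makes the 1s in every row contiguous.

    Exhaustive search over column permutations -- exponential and purely
    illustrative; the Booth-Lueker PQ-tree algorithm runs in linear time.
    """
    n_cols = len(matrix[0])
    for perm in permutations(range(n_cols)):
        ok = True
        for row in matrix:
            # positions of this row's 1s under the candidate column order
            ones = [i for i, c in enumerate(perm) if row[c] == 1]
            if ones and ones[-1] - ones[0] + 1 != len(ones):
                ok = False
                break
        if ok:
            return True
    return False
```

For example, [[1,0,1],[0,1,1]] has the property (order the columns 0, 2, 1), while the "triangle" matrix [[1,1,0],[0,1,1],[1,0,1]] does not, since three column pairs cannot all be adjacent in a linear order of three columns.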

1,622 citations

Journal ArticleDOI
TL;DR: For every positive ε there exists an O(n)-time algorithm S such that, if S(L) denotes the number of bins used by S for a list L, then S(L)/L* ≤ 1 + ε for any L, provided L* is sufficiently large.
Abstract: For any list L of n numbers in (0, 1), let L* denote the minimum number of unit-capacity bins needed to pack the elements of L. We prove that, for every positive ε, there exists an O(n)-time algorithm S such that, if S(L) denotes the number of bins used by S for L, then S(L)/L* ≤ 1 + ε for any L, provided L* is sufficiently large.
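To make the notation concrete: S(L) counts the bins a heuristic uses and L* the optimum. The First Fit heuristic below is a simple stand-in for S, not the paper's near-optimal scheme, which achieves ratio 1 + ε.

```python
def first_fit(items):
    """Pack items (each in (0, 1)) into unit-capacity bins with First Fit.

    len(first_fit(L)) plays the role of S(L) in the abstract's ratio
    S(L)/L*.  First Fit is NOT the paper's algorithm -- it is a simple
    illustrative heuristic.
    """
    bins = []  # each bin is a list of item sizes
    for x in items:
        for b in bins:
            if sum(b) + x <= 1.0:  # first bin with room
                b.append(x)
                break
        else:
            bins.append([x])  # open a new bin
    return bins
```

On the list [0.5, 0.7, 0.5, 0.3] First Fit uses 2 bins ([0.5, 0.5] and [0.7, 0.3]), which here matches the optimum L* = 2.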

550 citations

Journal ArticleDOI
TL;DR: It is shown that for a somewhat larger class of graphs, namely the chordal graphs, isomorphism is as hard as for general graphs.
Authors: Lueker, George S.; Booth, Kellogg S.
Abstract: A graph is an interval graph if and only if each of its vertices can be associated with an interval on the real line in such a way that two vertices are adjacent in the graph exactly when the corresponding intervals have a nonempty intersection. An efficient algorithm for testing isomorphism of interval graphs is implemented using a data structure called a PQ-tree. The algorithm runs in O(n + e) steps for graphs having n vertices and e edges. It is shown that for a somewhat larger class of graphs, namely the chordal graphs, isomorphism is as hard as for general graphs.
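The defining condition, vertices adjacent exactly when intervals intersect, is easy to state in code. The sketch below builds the edge set of an interval graph directly from the definition; it is a pedagogical helper, unrelated to the PQ-tree isomorphism algorithm in the paper.

```python
def interval_graph(intervals):
    """Build the edge set of the interval graph for a dict name -> (lo, hi).

    Two vertices are adjacent exactly when their closed intervals have a
    nonempty intersection, as in the abstract's definition.
    """
    names = list(intervals)
    edges = set()
    for i, u in enumerate(names):
        for v in names[i + 1:]:
            a_lo, a_hi = intervals[u]
            b_lo, b_hi = intervals[v]
            if a_lo <= b_hi and b_lo <= a_hi:  # intervals overlap
                edges.add(frozenset((u, v)))
    return edges
```

For intervals a = [0, 2], b = [1, 3], c = [4, 5], only a and b overlap, so the graph has the single edge {a, b}.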

214 citations

Journal ArticleDOI
TL;DR: A transformation is presented that enables range restrictions to be added to an arbitrary dynamic data structure on n elements, provided that the problem satisfies a certain decomposability condition and one is willing to allow increases by a factor of O(log n) in the worst-case time per operation and in the space used.
Abstract: A database is said to allow range restrictions if one may request that only records with some specified field in a specified range be considered when answering a given query. A transformation is presented that enables range restrictions to be added to an arbitrary dynamic data structure on n elements, provided that the problem satisfies a certain decomposability condition and that one is willing to allow increases by a factor of O(log n) in the worst-case time for an operation and in the space used. This is a generalization of a known transformation that works for static structures. This transformation is then used to produce a data structure for range queries in k dimensions with worst-case times of O(log^k n) for each insertion, deletion, or query operation.
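The decomposability condition means the answer over a union of sets can be combined from per-set answers. Counting elements in a range is a classic decomposable query: the count over a union of blocks is the sum of per-block counts. The sketch below is a hypothetical stand-in to show that property, not the paper's transformation.

```python
import bisect

def range_count(blocks, lo, hi):
    """Count elements in [lo, hi] across a collection of sorted blocks.

    Range counting is decomposable: the answer over the union of the
    blocks is the sum of per-block answers.  This per-block combination
    is the property the abstract's transformation relies on; the blocks
    here are an illustrative stand-in, not the paper's data structure.
    """
    total = 0
    for b in blocks:  # each b is a sorted list
        total += bisect.bisect_right(b, hi) - bisect.bisect_left(b, lo)
    return total
```

With blocks [1, 3, 5] and [2, 4, 6], the query [2, 5] counts 2, 3, 4, 5, giving 4, exactly the sum of the two per-block counts.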

205 citations

Book
01 Apr 1991
TL;DR: Bin Packing: Heuristics, Scheduling and Partitioning, and Packings in Two Dimensions.
Abstract: Analysis Techniques. Matching Problems. Scheduling and Partitioning. Bin Packing: The Optimum Solution. Bin Packing: Heuristics. Packings in Two Dimensions. References. Index.

169 citations


Cited by
Proceedings ArticleDOI
22 Jan 2006
TL;DR: Some of the major results in random graphs and some of the more challenging open problems are reviewed, including those related to the WWW.
Abstract: We will review some of the major results in random graphs and some of the more challenging open problems. We will cover algorithmic and structural questions. We will touch on newer models, including those related to the WWW.

7,116 citations

Journal ArticleDOI
TL;DR: Monocle is described, an unsupervised algorithm that increases the temporal resolution of transcriptome dynamics using single-cell RNA-Seq data collected at multiple time points that revealed switch-like changes in expression of key regulatory factors, sequential waves of gene regulation, and expression of regulators that were not known to act in differentiation.
Abstract: Defining the transcriptional dynamics of a temporal process such as cell differentiation is challenging owing to the high variability in gene expression between individual cells. Time-series gene expression analyses of bulk cells have difficulty distinguishing early and late phases of a transcriptional cascade or identifying rare subpopulations of cells, and single-cell proteomic methods rely on a priori knowledge of key distinguishing markers. Here we describe Monocle, an unsupervised algorithm that increases the temporal resolution of transcriptome dynamics using single-cell RNA-Seq data collected at multiple time points. Applied to the differentiation of primary human myoblasts, Monocle revealed switch-like changes in expression of key regulatory factors, sequential waves of gene regulation, and expression of regulators that were not known to act in differentiation. We validated some of these predicted regulators in a loss-of-function screen. Monocle can in principle be used to recover single-cell gene expression kinetics from a wide array of cellular processes, including differentiation, proliferation and oncogenic transformation.

4,119 citations

Book
01 Jan 1983
TL;DR: The basis of this book is the material contained in the first six chapters of the earlier work, The Design and Analysis of Computer Algorithms; material on algorithms for external storage and memory management has been added.
Abstract: From the Publisher: This book presents the data structures and algorithms that underpin much of today's computer programming. The basis of this book is the material contained in the first six chapters of our earlier work, The Design and Analysis of Computer Algorithms. We have expanded that coverage and have added material on algorithms for external storage and memory management. As a consequence, this book should be suitable as a text for a first course on data structures and algorithms. The only prerequisite we assume is familiarity with some high-level programming language such as Pascal.

2,690 citations

Book
01 Jan 2003
TL;DR: Rina Dechter synthesizes three decades of researchers' work on constraint processing in AI, databases and programming languages, operations research, management science, and applied mathematics to provide the first comprehensive examination of the theory that underlies constraint processing algorithms.
Abstract: Constraint satisfaction is a simple but powerful tool. Constraints identify the impossible and reduce the realm of possibilities to effectively focus on the possible, allowing for a natural declarative formulation of what must be satisfied, without expressing how. The field of constraint reasoning has matured over the last three decades with contributions from a diverse community of researchers in artificial intelligence, databases and programming languages, operations research, management science, and applied mathematics. Today, constraint problems are used to model cognitive tasks in vision, language comprehension, default reasoning, diagnosis, scheduling, and temporal and spatial reasoning. In Constraint Processing, Rina Dechter synthesizes these contributions, along with her own significant work, to provide the first comprehensive examination of the theory that underlies constraint processing algorithms. Throughout, she focuses on fundamental tools and principles, emphasizing the representation and analysis of algorithms.
· Examines the basic practical aspects of each topic and then tackles more advanced issues, including current research challenges
· Builds the reader's understanding with definitions, examples, theory, algorithms and complexity analysis
· Synthesizes three decades of researchers' work on constraint processing in AI, databases and programming languages, operations research, management science, and applied mathematics
Table of Contents: Preface; Introduction; Constraint Networks; Consistency-Enforcing Algorithms: Constraint Propagation; Directional Consistency; General Search Strategies; General Search Strategies: Look-Back; Local Search Algorithms; Advanced Consistency Methods; Tree-Decomposition Methods; Hybrid of Search and Inference: Time-Space Trade-offs; Tractable Constraint Languages; Temporal Constraint Networks; Constraint Optimization; Probabilistic Networks; Constraint Logic Programming; Bibliography
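The declarative style the abstract describes, stating what must be satisfied without saying how, can be sketched with a minimal backtracking solver. This is an illustrative toy, not one of the optimized algorithms from the book, and all names in it are hypothetical.

```python
def solve_csp(variables, domains, constraints):
    """Minimal backtracking search for a constraint satisfaction problem.

    variables: list of variable names (assigned in this order);
    domains: dict mapping each name to an iterable of candidate values;
    constraints: list of (scope_tuple, predicate) pairs.

    Illustrates the declarative 'what, not how' formulation from the
    abstract; NOT one of the book's algorithms (no propagation,
    no look-back, no decomposition).
    """
    def consistent(assignment):
        # A constraint is checked only once all variables in its scope are set.
        for scope, pred in constraints:
            if all(v in assignment for v in scope):
                if not pred(*(assignment[v] for v in scope)):
                    return False
        return True

    def backtrack(assignment):
        if len(assignment) == len(variables):
            return dict(assignment)
        var = variables[len(assignment)]
        for val in domains[var]:
            assignment[var] = val
            if consistent(assignment):
                result = backtrack(assignment)
                if result is not None:
                    return result
            del assignment[var]  # undo and try the next value
        return None

    return backtrack({})
```

For instance, with variables x and y over {1, 2} and the single constraint x ≠ y, the solver returns the first satisfying assignment {x: 1, y: 2}.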

1,739 citations

Journal ArticleDOI
TL;DR: It is proved that no MAX SNP-hard problem has a polynomial time approximation scheme, unless NP = P, and there exists a positive ε such that approximating the maximum clique size in an N-vertex graph to within a factor of N^ε is NP-hard.
Abstract: We show that every language in NP has a probabilistic verifier that checks membership proofs for it using a logarithmic number of random bits and by examining a constant number of bits in the proof. If a string is in the language, then there exists a proof such that the verifier accepts with probability 1 (i.e., for every choice of its random string). For strings not in the language, the verifier rejects every provided "proof" with probability at least 1/2. Our result builds upon and improves a recent result of Arora and Safra [1998], whose verifiers examine a nonconstant number of bits in the proof (though this number is a very slowly growing function of the input length). As a consequence, we prove that no MAX SNP-hard problem has a polynomial time approximation scheme, unless NP = P. The class MAX SNP was defined by Papadimitriou and Yannakakis [1991]; hard problems for this class include vertex cover, maximum satisfiability, maximum cut, metric TSP, Steiner trees, and shortest superstring. We also improve upon the clique hardness results of Feige et al. [1996] and Arora and Safra [1998] and show that there exists a positive ε such that approximating the maximum clique size in an N-vertex graph to within a factor of N^ε is NP-hard.

1,501 citations