
Showing papers on "Graph (abstract data type)" published in 1994


Journal ArticleDOI
01 Nov 1994-Science
TL;DR: This experiment demonstrates the feasibility of carrying out computations at the molecular level by solving an instance of the directed Hamiltonian path problem with standard protocols and enzymes.
Abstract: The tools of molecular biology were used to solve an instance of the directed Hamiltonian path problem. A small graph was encoded in molecules of DNA, and the "operations" of the computation were performed with standard protocols and enzymes. This experiment demonstrates the feasibility of carrying out computations at the molecular level.
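The problem the experiment encodes is easy to state in code; a brute-force sketch (on a small hypothetical instance, not the graph actually used in the experiment) makes the contrast with the DNA approach concrete:

```python
from itertools import permutations

def hamiltonian_path(vertices, edges, start, end):
    """Brute-force search for a directed Hamiltonian path from start to end.

    Mirrors the problem the DNA experiment solved, but by exhaustive
    enumeration: try every ordering of the intermediate vertices and
    check that consecutive pairs are edges of the graph.
    """
    middle = [v for v in vertices if v not in (start, end)]
    for perm in permutations(middle):
        path = [start, *perm, end]
        if all((a, b) in edges for a, b in zip(path, path[1:])):
            return path
    return None

# A hypothetical 5-vertex instance (not the 7-vertex graph from the paper).
vertices = [0, 1, 2, 3, 4]
edges = {(0, 1), (1, 2), (2, 3), (3, 4), (0, 3), (1, 4)}
print(hamiltonian_path(vertices, edges, 0, 4))  # [0, 1, 2, 3, 4]
```

The factorial blow-up of this enumeration is exactly what motivated exploring massively parallel molecular operations.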

4,266 citations


Book ChapterDOI
16 Jun 1994
TL;DR: Triple graph grammars are intended to fill the gap and to support the specification of interdependencies between graph-like data structures on a very high level.
Abstract: Data integration is a key issue for any integrated set of software tools. A typical CASE environment, for instance, offers tools for the manipulation of requirements and software design documents, and it provides more or less sophisticated assistance for keeping these documents in a consistent state. Up to now, almost all data consistency observing or preserving integration tools are hand-crafted due to the lack of generic implementation frameworks and the absence of adequate specification formalisms. Triple graph grammars are intended to fill this gap and to support the specification of interdependencies between graph-like data structures on a very high level. Furthermore, they are the fundamentals of a new machinery for the production of batch-oriented as well as incrementally working data integration tools.

799 citations


Journal ArticleDOI
TL;DR: A survey of existing methods of communication in usual networks, particularly the complete network, the ring, the torus, the grid, the hypercube, the cube connected cycles, the undirected de Bruijn graph, the star graph, the shuffle-exchange graph, and the butterfly graph.

398 citations


Journal ArticleDOI
TL;DR: A spectral approach to multi-way ratio-cut partitioning that provides a generalization of the ratio-cut cost metric to L-way partitioning and a lower bound on this cost metric is developed.
Abstract: Recent research on partitioning has focused on the ratio-cut cost metric, which maintains a balance between the cost of the edges cut and the sizes of the partitions without fixing the size of the partitions a priori. Iterative approaches and spectral approaches to two-way ratio-cut partitioning have yielded higher quality partitioning results. In this paper, we develop a spectral approach to multi-way ratio-cut partitioning that provides a generalization of the ratio-cut cost metric to L-way partitioning and a lower bound on this cost metric. Our approach involves finding the k smallest eigenvalue/eigenvector pairs of the Laplacian of the graph. The eigenvectors provide an embedding of the graph's n vertices into a k-dimensional subspace. We devise a time and space efficient clustering heuristic to coerce the points in the embedding into k partitions. Advancement over the current work is evidenced by the results of experiments on the standard benchmarks.
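The Laplacian at the heart of the spectral approach is straightforward to construct; a minimal pure-Python sketch (the eigensolving step the paper's method requires is omitted):

```python
def laplacian(n, edges):
    """Build the graph Laplacian L = D - A for an undirected graph.

    The spectral ratio-cut approach embeds vertices using the k smallest
    eigenvectors of this matrix; here we only construct L itself.
    """
    L = [[0] * n for _ in range(n)]
    for u, v in edges:
        L[u][u] += 1          # degree on the diagonal
        L[v][v] += 1
        L[u][v] -= 1          # -1 for each adjacency
        L[v][u] -= 1
    return L

# 4-cycle: every vertex has degree 2.
L = laplacian(4, [(0, 1), (1, 2), (2, 3), (3, 0)])
print(L[0])                              # [2, -1, 0, -1]
print(all(sum(row) == 0 for row in L))   # True: Laplacian rows sum to zero
```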

394 citations


Journal ArticleDOI
TL;DR: It is shown that finding the maximum a-posteriori probability (MAP) instantiation of all the random variables given the evidence is NP-hard in the general case when graph representations are used, even if the size of the representation happens to be linear in n.

379 citations


Journal ArticleDOI
TL;DR: The cognitive structure of graphics is examined and a structural classification of visual representations of graphs and images is reported; if visualization is to continue to advance as an interdisciplinary science, it must become more than a grab bag of techniques for displaying data.
Abstract: Why do we often prefer glancing at a graph to studying a table of numbers? What might be a better graphic than either a graph or table for seeing how a biological process unfolds with time? To begin to answer these kinds of questions we examine the cognitive structure of graphics and report a structural classification of visual representations. McCormick, DeFanti, and Brown [16] define visualization as "the study of mechanisms in computers and in humans which allow them in concert to perceive, use, and communicate visual information." Thus, visualization includes the study of both image synthesis and image understanding. Given this broad focus, it is not surprising that visualization spans many academic disciplines, scientific fields, and multiple domains of inquiry. However, if visualization is to continue to advance as an interdisciplinary science, it must become more than a grab bag of techniques for displaying data. Our research focuses on classifying visual information. Classification lies at the heart of every scientific field. Classifications structure domains of systematic inquiry and provide concepts for developing theories to identify anomalies and to predict future research needs. Extant taxonomies of graphs and images can be characterized as either functional or structural. Functional taxonomies focus on the intended use and purpose of the graphic material. For example, consider the functional classification developed by Macdonald-Ross [14]. One of its main categories is technical diagrams used for maintaining, operating, and troubleshooting complex equipment. Other examples of functional classifications can be found in Tufte [2]. A functional classification does not reflect the physical structure of images, nor is it intended to correspond to an underlying representation in memory [1]. In contrast, structural categories are well learned and are derived from exemplar learning. They focus on the form of the image rather than its content.
Rankin [18] and Bertin [2] developed such structural categories of graphs. Rankin used the number of dimensions and graph forms to determine his classification.

305 citations


Journal ArticleDOI
TL;DR: This paper investigates the planning of assembly algorithms specifying (dis)assembly operations on the components of a product and the ordering of these operations, and presents measures to evaluate the complexity of these algorithms and techniques to estimate the inherent complexity of a product.

297 citations


Journal ArticleDOI
TL;DR: In this article, the authors generalized Freeman's geodesic centrality measures for betweenness on undirected graphs to the more general directed case, and defined a unique maximally centralized graph for directed graphs, holding constant the numbers of points with reciprocatable (incoming and outgoing) versus only unreciprocatable (outgoing only or incoming only) arcs.

276 citations


Proceedings Article
12 Sep 1994
TL;DR: Query optimization is done by making a graph of the query and moving predicates around in the graph so that they will be applied early in the optimized query generated from the graph.

Abstract: Query optimization is done by making a graph of the query and moving predicates around in the graph so that they will be applied early in the optimized query generated from the graph. Predicates are first propagated up from child nodes of the graph to parent nodes and then down into different child nodes. After the predicates have been moved, redundant predicates are detected and removed. Predicates are moved through aggregation operations and new predicates are deduced from aggregation operations and from functional dependencies. The optimization is not dependent on join order and works where nodes of the graph cannot be merged.
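A toy sketch of the "move predicates down" idea, using a hypothetical dictionary-based query graph (the paper's algorithm also propagates predicates up, through aggregation, and via functional dependencies, none of which is modeled here):

```python
def scan(name):
    """A hypothetical leaf node of the query graph: a base-table scan."""
    return {"op": "scan", "name": name, "children": [], "predicates": set()}

def push_down(node, pred):
    """Push a predicate from a node into its children where possible.

    For a union-style node, a predicate on the result can instead be
    applied in every child, so rows are filtered as early as possible.
    """
    if node["op"] == "union":
        node["predicates"].discard(pred)
        for child in node["children"]:
            child["predicates"].add(pred)
    return node

q = {"op": "union", "children": [scan("t1"), scan("t2")], "predicates": {"x > 5"}}
push_down(q, "x > 5")
print([c["predicates"] for c in q["children"]])  # [{'x > 5'}, {'x > 5'}]
```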

253 citations


Journal ArticleDOI
TL;DR: Graph-theoretic methods for the representation and searching of three-dimensional patterns of side-chains in protein structures are discussed, demonstrating that the search algorithm can successfully retrieve the great majority of the expected proteins, as well as other, previously unreported proteins that contain the pattern of interest.

Book ChapterDOI
29 Jul 1994
TL;DR: An approach to the solution of decision problems formulated as influence diagrams involves a special triangulation of the underlying graph, the construction of a junction tree with special properties, and a message passing algorithm operating on the junction tree for computation of expected utilities and optimal decision policies.
Abstract: We present an approach to the solution of decision problems formulated as influence diagrams. This approach involves a special triangulation of the underlying graph, the construction of a junction tree with special properties, and a message passing algorithm operating on the junction tree for computation of expected utilities and optimal decision policies.

Journal ArticleDOI
TL;DR: A heuristic for the multiple tour maximum collection problem is developed that has the ability to handle large problems in a very short amount of computation time and will always produce a feasible solution if one exists.


Journal ArticleDOI
TL;DR: A graph-oriented object database model (GOOD) is introduced as a theoretical basis for database systems in which manipulation as well as conceptual representation of data is transparently graph-based.
Abstract: A graph-oriented object database model (GOOD) is introduced as a theoretical basis for database systems in which manipulation as well as conceptual representation of data is transparently graph-based. In the GOOD model, the scheme as well as the instance of an object database is represented by a graph, and the data manipulation is expressed by graph transformations. These graph transformations are described using five basic operations and a method construct, all with a natural semantics. The basic operations add and delete objects and edges as a function of the matchings of a pattern. The expressiveness of the model in terms of object-oriented modeling and data manipulation power is investigated.

Journal ArticleDOI
TL;DR: A distributed self-stabilizing Depth-First Search (DFS) spanning tree algorithm, whose output is a DFS spanning tree of the communication graph, kept in a distributed fashion.
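The tree such an algorithm maintains can be sketched centrally (this illustrates only the output, a DFS spanning tree as parent pointers, not the distributed self-stabilizing protocol itself):

```python
def dfs_spanning_tree(adj, root):
    """Compute a DFS spanning tree, as parent pointers, of a connected graph.

    A centralized sketch of the structure the self-stabilizing algorithm
    converges to; maintaining it in a distributed, fault-tolerant fashion
    is the paper's contribution and is not modeled here.
    """
    parent = {root: None}

    def visit(u):
        for v in adj[u]:
            if v not in parent:   # tree edge: first time v is reached
                parent[v] = u
                visit(v)

    visit(root)
    return parent

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
print(dfs_spanning_tree(adj, 0))  # {0: None, 1: 0, 2: 1, 3: 2}
```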

Journal ArticleDOI
TL;DR: This paper gives some integer programming formulations for the Steiner tree problem on undirected and directed graphs, studies the associated polyhedra, and gives some families of facets for the undirected case along with some compositions and extensions.
Abstract: In this paper we give some integer programming formulations for the Steiner tree problem on undirected and directed graphs and study the associated polyhedra. We give some families of facets for the undirected case along with some compositions and extensions. We also give a projection that relates the Steiner tree polyhedron on an undirected graph to the polyhedron for the corresponding directed graph. This is used to show that the LP-relaxation of the directed formulation is superior to the LP-relaxation of the undirected one.

Journal ArticleDOI
TL;DR: A polynomial-time algorithm for testing if a triconnected directed graph has an upward drawing is presented, based on a new combinatorial characterization that maps the problem into a max-flow problem on a sparse network.
Abstract: A polynomial-time algorithm for testing if a triconnected directed graph has an upward drawing is presented. An upward drawing is a planar drawing such that all the edges flow in a common direction (e.g., from bottom to top). The problem arises in the fields of automatic graph drawing and ordered sets, and has been open for several years. The proposed algorithm is based on a new combinatorial characterization that maps the problem into a max-flow problem on a sparse network; the time complexity is O(n + r²), where n is the number of vertices and r is the number of sources and sinks of the directed graph. If the directed graph has an upward drawing, the algorithm allows us to construct one easily.

Proceedings ArticleDOI
01 Dec 1994
TL;DR: 'Chopping' is defined, a generalization of slicing that can express most of its variants, and it is shown that, using the dependence graph, it produces more accurate results than algorithms based directly on the PDG.
Abstract: A dependence model for reverse engineering should treat procedures in a modular fashion and should be fine-grained, distinguishing dependences that are due to different variables. The program dependence graph (PDG) satisfies neither of these criteria. We present a new form of dependence graph that satisfies both, while retaining the advantages of the PDG: it is easy to construct and allows program slicing to be implemented as a simple graph traversal. We define 'chopping', a generalization of slicing that can express most of its variants, and show that, using our dependence graph, it produces more accurate results than algorithms based directly on the PDG.
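Viewed purely as graph traversal, slicing and chopping can be sketched as reachability computations (a simplification of the paper's fine-grained, procedure-aware model; the numeric node names below are hypothetical program points):

```python
def reach(edges, start, reverse=False):
    """Set of nodes reachable from `start` along dependence edges."""
    adj = {}
    for a, b in edges:
        if reverse:
            a, b = b, a
        adj.setdefault(a, []).append(b)
    seen, stack = {start}, [start]
    while stack:
        for v in adj.get(stack.pop(), []):
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def chop(edges, source, target):
    """A chop: nodes lying on some dependence path from source to target.

    A backward slice is plain reverse reachability; chopping intersects
    the forward reach of the source with the backward reach of the target.
    """
    return reach(edges, source) & reach(edges, target, reverse=True)

deps = [(1, 2), (2, 3), (2, 4), (4, 5), (6, 4)]
print(sorted(chop(deps, 1, 5)))  # [1, 2, 4, 5]
```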

Journal ArticleDOI
TL;DR: A set of constraints is identified which gives rise to a class of tractable problems, and polynomial time algorithms are given for solving such problems; it is proved that the class of problems generated by any set of constraints not contained in this restricted set is NP-complete.

Journal ArticleDOI
TL;DR: The algorithm's completeness ensures that the adaptation algorithm will eventually search the entire graph and its systematicity ensures that it will do so without redundantly searching any parts of the graph.
Abstract: The paradigms of transformational planning, case-based planning, and plan debugging all involve a process known as plan adaptation -- modifying or repairing an old plan so it solves a new problem. In this paper we provide a domain-independent algorithm for plan adaptation, demonstrate that it is sound, complete, and systematic, and compare it to other adaptation algorithms in the literature. Our approach is based on a view of planning as searching a graph of partial plans. Generative planning starts at the graph's root and moves from node to node using plan-refinement operators. In planning by adaptation, a library plan--an arbitrary node in the plan graph--is the starting point for the search, and the plan-adaptation algorithm can apply both the same refinement operators available to a generative planner and can also retract constraints and steps from the plan. Our algorithm's completeness ensures that the adaptation algorithm will eventually search the entire graph and its systematicity ensures that it will do so without redundantly searching any parts of the graph.

Book ChapterDOI
07 Nov 1994
TL;DR: An extension to the TAM model is proposed to deal efficiently with authorization schemes involving sets of privileges and can be useful to identify which privilege transfers can lead to unsafe protection states.
Abstract: In this paper, an extension to the TAM model is proposed to deal efficiently with authorization schemes involving sets of privileges. This new formalism provides a technique to analyse the safety problem for this kind of schemes and can be useful to identify which privilege transfers can lead to unsafe protection states. Further extensions are suggested towards quantitative evaluation of operational security and intrusion detection.

Journal ArticleDOI
TL;DR: A general method for space variant image processing, based on a connectivity graph which represents the neighbor-relations in an arbitrarily structured sensor, which is suitable for real-time implementation, and provides a generic solution to a wide range of image processing applications with space variant sensors.
Abstract: This paper describes a graph-based approach to image processing, intended for use with images obtained from sensors having space variant sampling grids. The connectivity graph (CG) is presented as a fundamental framework for posing image operations in any kind of space variant sensor. Partially motivated by the observation that human vision is strongly space variant, a number of research groups have been experimenting with space variant sensors. Such systems cover wide solid angles yet maintain high acuity in their central regions. Implementation of space variant systems poses at least two outstanding problems. First, such a system must be active, in order to utilize its high acuity region; second, there are significant image processing problems introduced by the non-uniform pixel size, shape and connectivity. Familiar image processing operations such as connected components, convolution, template matching, and even image translation, take on new and different forms when defined on space variant images. The present paper provides a general method for space variant image processing, based on a connectivity graph which represents the neighbor-relations in an arbitrarily structured sensor. We illustrate this approach with the following applications: (1) Connected components is reduced to its graph theoretic counterpart. We illustrate this on a logmap sensor, which possesses a difficult topology due to the branch cut associated with the complex logarithm function. (2) We show how to write local image operators in the connectivity graph that are independent of the sensor geometry. (3) We relate the connectivity graph to pyramids over irregular tessellations, and implement a local binarization operator in a 2-level pyramid. (4) Finally, we expand the connectivity graph into a structure we call a transformation graph, which represents the effects of geometric transformations in space variant image sensors.
Using the transformation graph, we define an efficient algorithm for matching in the logmap images and solve the template matching problem for space variant images. Because of the very small number of pixels typical of logarithmic structured space variant arrays, the connectivity graph approach to image processing is suitable for real-time implementation, and provides a generic solution to a wide range of image processing applications with space variant sensors.
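The reduction of connected components "to its graph theoretic counterpart" can be sketched directly on a neighbor-relation dictionary (the relation below is a hypothetical five-pixel example, not a real logmap sensor):

```python
def connected_components(neighbors):
    """Connected components of an arbitrary connectivity graph.

    Because the connectivity graph encodes neighbor relations explicitly,
    the same code works for any sensor geometry: log-polar, rectangular,
    or fully irregular tessellations.
    """
    seen, components = set(), []
    for start in neighbors:
        if start in seen:
            continue
        comp, stack = set(), [start]
        seen.add(start)
        while stack:
            u = stack.pop()
            comp.add(u)
            for v in neighbors[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        components.append(comp)
    return components

# Hypothetical neighbor relation for five "pixels" of an irregular sensor.
nbrs = {0: [1], 1: [0, 2], 2: [1], 3: [4], 4: [3]}
print(connected_components(nbrs))  # [{0, 1, 2}, {3, 4}]
```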

Journal ArticleDOI
TL;DR: This paper study the problem of linking a set of transceiver stations in a visibility-connected communication network, by placing a minimum number of relays on the terrain surface, and proposes a practical approximate solution based on a Steiner heuristic.
Abstract: Line-of-sight communication on topographic surfaces has relevance for several applications of Geographical Information Systems. In this paper, we study the problem of linking a set of transceiver stations in a visibility-connected communication network, by placing a minimum number of relays on the terrain surface. The problem is studied in the framework of a discrete visibility model, where the mutual visibility of a finite set of sites on the terrain is represented through a graph, called the visibility graph. While in the special case of only two transceivers an optimal solution can be found in polynomial time, by computing a minimum path on the visibility graph, the general problem is equivalent to a Steiner problem on the visibility graph, and, thus, it is intractable in practice. In the latter case, we propose a practical approximate solution based on a Steiner heuristic. For both the special and the general case, we propose both a static and a dynamic algorithm that allow computation of a s...
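For the two-transceiver special case, the minimum-path computation on the visibility graph is a plain breadth-first search; a sketch with a hypothetical visibility relation (site names `A`, `B`, `r1`, `r2` are illustrative only):

```python
from collections import deque

def fewest_relays(visible, a, b):
    """Shortest relay chain linking transceivers a and b.

    In the two-transceiver case the optimal relay placement is a minimum
    path in the visibility graph; BFS finds it. `visible` maps each site
    to the sites it can see. The interior nodes of the returned path are
    the relays to place.
    """
    prev, queue = {a: None}, deque([a])
    while queue:
        u = queue.popleft()
        if u == b:
            path = []
            while u is not None:   # walk parent pointers back to a
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in visible[u]:
            if v not in prev:
                prev[v] = u
                queue.append(v)
    return None  # a and b lie in different visibility components

visible = {"A": ["r1"], "r1": ["A", "r2"], "r2": ["r1", "B"], "B": ["r2"]}
print(fewest_relays(visible, "A", "B"))  # ['A', 'r1', 'r2', 'B']
```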

Journal ArticleDOI
TL;DR: In this article, a complete revision of the 'size-strength' graph is proposed, based on more than 100 case studies, with particular emphasis on rock masses which can be broken up solely by mechanical means such as ripping.
Abstract: A ‘size-strength’ graph, subdivided for various methods of excavation, was published by Franklin, Broch & Walton in 1971. Although this graph allows excavatability to be assessed rapidly, the subdivisions have become outdated as more powerful, more efficient equipment has become available. A complete revision of the graphical method, based on more than 100 case studies, is proposed. Particular emphasis is placed on rock masses which can be broken up solely by mechanical means such as ripping. Procedures are recommended for obtaining geotechnical data, and the importance of interpreting field observations in their geomorphological context is discussed. Consideration is also given to the effects of block shape and orientation on ripper performance, and adjustments to the input data are suggested. The revised graph is intended for all types of project but especially for road construction, where plant mobility and flexibility are important.

Book ChapterDOI
10 Oct 1994
TL;DR: This paper shows that upward planarity testing and rectilinear planarity testing are NP-complete problems, and that it is NP-hard to approximate the minimum number of bends in a planar orthogonal drawing of an n-vertex graph with an O(n^(1−ε)) error.
Abstract: A directed graph is upward planar if it can be drawn in the plane such that every edge is a monotonically increasing curve in the vertical direction, and no two edges cross. An undirected graph is rectilinear planar if it can be drawn in the plane such that every edge is a horizontal or vertical segment, and no two edges cross. Testing upward planarity and rectilinear planarity are fundamental problems in the effective visualization of various graph and network structures. In this paper we show that upward planarity testing and rectilinear planarity testing are NP-complete problems. We also show that it is NP-hard to approximate the minimum number of bends in a planar orthogonal drawing of an n-vertex graph with an O(n^(1−ε)) error, for any ε > 0.

Book ChapterDOI
29 Jul 1994
TL;DR: In this paper, a method for reducing the computational complexity of Bayesian networks through identification and removal of weak dependences (removal of links from the (moralized) independence graph) is presented.
Abstract: The paper presents a method for reducing the computational complexity of Bayesian networks through identification and removal of weak dependences (removal of links from the (moralized) independence graph). The removal of a small number of links may reduce the computational complexity dramatically, since several fill-ins and moral links may be rendered superfluous by the removal. The method is described in terms of impact on the independence graph, the junction tree, and the potential functions associated with these. An empirical evaluation of the method using large real-world networks demonstrates the applicability of the method. Further, the method, which has been implemented in Hugin, complements the approximation method suggested by Jensen & Andersen (1990).

Journal ArticleDOI
TL;DR: This three-part series of articles describes one approach to synthesis of solutions to a class of mechanical design problems, which involve transmission and transformation of mechanical forces and motion, and can be described by a set of inputs and outputs.
Abstract: Conceptual design is an early phase in the design process, which involves the generation of solution concepts to satisfy the functional requirements of a design problem. There can be more than one solution to a problem; this means that there is scope for producing improved designs if one could explore a solution space larger than is possible at present. Computer support to conceptual design could be effective to this end, if an adequate understanding of the required design knowledge and subsequent tools for its representation and manipulation were available. This three-part series of articles describes one approach to synthesis of solutions to a class of mechanical design problems; these involve transmission and transformation of mechanical forces and motion, and can be described by a set of inputs and outputs. The approach involves (1) identifying a set of primary functional elements and rules of combining them, and (2) developing appropriate representations and reasoning procedures for synthesising solution concepts using these elements and their combination rules; these synthesis procedures can produce an exhaustive set of solution concepts, in terms of their topological as well as spatial configurations, to a given design problem. Part I provides an overview of the scope and the approach, adopted in the entire series, to identify the design knowledge required for synthesis, and a method for its validation. It specifically focuses on the extraction and representation of this knowledge. Part II describes synthesis of topological (graph structure) descriptions of possible solutions to a given problem. Part III describes a procedure for producing spatial configurations of these solutions.

Book ChapterDOI
09 May 1994
TL;DR: The concept of multipermutation, a pair of orthogonal latin squares, is proposed as a new cryptographic primitive generalizing the boxes of the FFT; the analysis determines the minimal depth of FFT-compression networks for collision-resistant hashing.
Abstract: Black box cryptanalysis applies to hash algorithms consisting of many small boxes, connected by a known graph structure, so that the boxes can be evaluated forward and backwards by given oracles. We study attacks that work for any choice of the black boxes, i.e. we scrutinize the given graph structure. For example we analyze the graph of the fast Fourier transform (FFT). We present optimal black box inversions of FFT-compression functions and black box constructions of collisions. This determines the minimal depth of FFT-compression networks for collision-resistant hashing. We propose the concept of multipermutation, which is a pair of orthogonal latin squares, as a new cryptographic primitive that generalizes the boxes of the FFT. Our examples of multipermutations are based on the operations circular rotation, bitwise xor, addition and multiplication.
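The defining property of a multipermutation as described here, a pair of orthogonal latin squares, can be checked in a few lines (the two order-3 squares below are a standard textbook example, not taken from the paper):

```python
def is_latin(square):
    """Every row and every column is a permutation of 0..n-1."""
    n = len(square)
    cols = list(zip(*square))
    return all(sorted(line) == list(range(n)) for line in square + cols)

def orthogonal(a, b):
    """Two latin squares are orthogonal iff superimposing them yields
    every ordered pair of symbols exactly once."""
    n = len(a)
    pairs = {(a[i][j], b[i][j]) for i in range(n) for j in range(n)}
    return len(pairs) == n * n

A = [[0, 1, 2], [1, 2, 0], [2, 0, 1]]
B = [[0, 1, 2], [2, 0, 1], [1, 2, 0]]
print(is_latin(A) and is_latin(B) and orthogonal(A, B))  # True
```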

Patent
Wing Y. Au1
28 Mar 1994
TL;DR: In this paper, a disk drive command queue reordering method includes calculation of least-latency and accounts for dependencies of I/O commands in the queue to avoid data hazards with respect to reordering of I /O commands.
Abstract: A disk drive command queue reordering method includes calculation of least-latency and accounts for dependencies of I/O commands in the queue to avoid data hazards with respect to reordering of I/O commands. Command queue representation is augmented by a graph structure representing dependencies of queue commands. Computational overhead associated with constructing, maintaining, and executing graph flow analysis relative to the dependency graph is sufficiently low to allow interleaved operation with higher level disk drive functions such as timely interaction with the host device. The disclosure includes a method of calculating and maintaining the dependency information in a command queue and using this information to constrain command reordering in a time and computationally efficient manner.