
Showing papers on "Adjacency list published in 1990"


01 Mar 1990
TL;DR: A new algorithm is proposed to solve the on-line vertex enumeration problem for polytopes, doing all computations in n-space, where n is the dimension of the polytope.

75 citations


Journal ArticleDOI
TL;DR: A Monte Carlo integer programming algorithm developed to generate short-term, spatially feasible timber harvest plans for a New Brunswick Crown license indicates that it is suitable for spatially constrained harvest scheduling on Crown licenses in New Brunswick.
Abstract: A Monte Carlo integer programming algorithm was developed to generate short-term (25-year), spatially feasible timber harvest plans for a New Brunswick Crown license. Solutions for the short-term plan are considered feasible if they meet spatial and temporal harvest-flow and adjacency constraints. The solution search procedure integrates a randomly generated harvesting sequence and checks of harvest-flow and adjacency constraints. The model was used to determine the annual allowable cut under three constraint formulations. The three formulations represented increasing levels of adjacency constraints, from no constraints to levels similar to current provincial requirements. The annual allowable cut under the most strict constraint formulation was reduced by 9% from the unconstrained formulation, for a given mapping strategy of a long-term harvest schedule. These applications of the model indicate that it is suitable for spatially constrained harvest scheduling on Crown licenses in New Brunswick.
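The search procedure described above (randomly generated harvesting sequences screened against adjacency and harvest-flow constraints) can be illustrated in miniature. The block layout, constraint limits, and function names below are invented for illustration and are not taken from the authors' model:

```python
import random

def feasible(schedule, adjacent, max_cut_per_period):
    """Check adjacency (no two adjacent blocks cut in the same period)
    and a simple harvest-flow constraint (per-period cut ceiling)."""
    by_period = {}
    for block, period in schedule.items():
        by_period.setdefault(period, []).append(block)
    for period, blocks in by_period.items():
        if len(blocks) > max_cut_per_period:
            return False          # harvest-flow violation
        for b in blocks:
            if any(n in blocks for n in adjacent[b]):
                return False      # adjacency violation
    return True

def monte_carlo_schedule(blocks, adjacent, periods, max_cut, tries=10000, seed=0):
    """Randomly assign blocks to harvest periods until a feasible plan is found."""
    rng = random.Random(seed)
    for _ in range(tries):
        schedule = {b: rng.randrange(periods) for b in blocks}
        if feasible(schedule, adjacent, max_cut):
            return schedule
    return None

# Illustrative 4-block forest: blocks 0-1 and 2-3 are adjacent pairs.
adj = {0: {1}, 1: {0}, 2: {3}, 3: {2}}
plan = monte_carlo_schedule([0, 1, 2, 3], adj, periods=2, max_cut=2)
```

A real model would replace the ceiling with volume-flow bounds and a much larger adjacency graph, but the accept/reject structure of the search is the same.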

72 citations


Book ChapterDOI
01 Jul 1990
TL;DR: It is shown that a maximum flow in a network with n vertices can be computed deterministically in O(n3/log n) time on a uniform-cost RAM, which improves the previous best bound of O( n3).
Abstract: We show that a maximum flow in a network with n vertices can be computed deterministically in O(n3/log n) time on a uniform-cost RAM. For dense graphs, this improves the previous best bound of O(n3).

71 citations


Book ChapterDOI
01 Apr 1990
TL;DR: An image analysis technique in which a separate hierarchy is built over every compact object of the input, made possible by a stochastic decimation algorithm which adapts the structure of the hierarchy to the analyzed image.
Abstract: In this paper we have presented an image analysis technique in which a separate hierarchy is built over every compact object of the input. The approach is made possible by a stochastic decimation algorithm which adapts the structure of the hierarchy to the analyzed image. For labeled images the final description is unique. For gray level images the classes are defined by converging local processes and slight differences may appear. At the apex every root can recover information about the represented object in a logarithmic number of processing steps, and thus the adjacency graph can become the foundation for a relational model of the scene.

61 citations


Journal ArticleDOI
11 Nov 1990
TL;DR: It is observed that perfect matching is not possible for a matched pair of nets with intersecting horizontal spans, so a technique to achieve almost perfect mirror symmetry is presented for such pairs of nets.
Abstract: A well-defined methodology for mapping the constraints on a set of critical coupling capacitances into constraints in the vertical-constraint (VC) graph of a channel is presented. The approach involves directing undirected edges, adding directed edges, and increasing the weights of edges in the VC graph in order to meet crossover constraints between orthogonal segments and adjacency constraints between parallel segments while attempting to cause minimum increase in the channel height due to the constraints. Use is made of shield nets when necessary. A formal description of the conditions under which the crossover and the adjacency constraints are satisfied is provided and used to construct the appropriate mapping algorithms. The problem of imposing matching constraints on the routing parasitics in a channel with lateral symmetry is addressed. It is observed that perfect matching is not possible for a matched pair of nets with intersecting horizontal spans. A technique to achieve almost perfect mirror symmetry is presented for such pairs of nets.

60 citations


Book ChapterDOI
01 Mar 1990
TL;DR: This work closes up substantially the gaps between the known lower and upper bounds for these succinct problems, in most cases matching optimally the lower and the upper bound.
Abstract: Highly regular graphs can be represented advantageously by some kind of description shorter than the full adjacency matrix; a natural succinct representation is by means of a boolean circuit computing the adjacency matrix as a boolean function. The complexity of the decision problems for several graph-theoretic properties changes drastically when this succinct representation is used to present the input. We close up substantially the gaps between the known lower and upper bounds for these succinct problems, in most cases matching optimally the lower and the upper bound.
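As an illustration of such succinct representations, a highly regular graph like the n-cube needs no stored matrix at all: its adjacency is computable by a constant-size boolean formula on the two vertex indices. The hypercube example below is our own sketch, not taken from the paper:

```python
def hypercube_adjacent(u, v):
    """Adjacency 'circuit' for the n-cube: vertices u, v (bit strings
    encoded as ints) are adjacent iff they differ in exactly one bit.
    This constant-size formula stands in for a 2**n x 2**n matrix."""
    x = u ^ v
    return x != 0 and (x & (x - 1)) == 0   # true iff x is a power of two

# The 3-cube: 8 vertices and 12 edges, enumerated without any stored matrix.
n = 3
edges = [(u, v) for u in range(2**n) for v in range(u + 1, 2**n)
         if hypercube_adjacent(u, v)]
```

The succinct-input results of the paper concern exactly this situation: the input to a graph problem is the small circuit, not the exponentially larger matrix it computes.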

48 citations


Journal ArticleDOI
TL;DR: An heuristic for adjacency constraint aggregation is proposed that is composed of two procedures: identifying harvesting areas for which it is not necessary to wri...
Abstract: An heuristic for adjacency constraint aggregation is proposed. The heuristic is composed of two procedures. Procedure 1 consists of identifying harvesting areas for which it is not necessary to wri...

44 citations


Journal ArticleDOI
TL;DR: The methods of computational geometry such as region decomposition are used not only to decompose a complex region, but also to reflect the boundary grading into the interior of the region.
Abstract: A triangular mesh generator is presented which makes extensive use of side swapping and mesh smoothing to create a grid with few obtuse triangles. To perform the above task a data structure is presented which holds full adjacency information both for nodes and elements. It is shown how this data structure is employed in the process of mesh generation, and how the methods of computational geometry such as region decomposition are used not only to decompose a complex region, but also to reflect the boundary grading into the interior of the region. Algorithms are provided to show the mechanism of these processes and practical examples are given to support the approach.

16 citations


Journal ArticleDOI
TL;DR: In this paper, it was shown that it is always possible to construct an orthogonal floorplan satisfying the area and adjacency requirements of any vertex-weighted maximal planar graph which is its dual.
Abstract: It is shown that it is always possible to construct an orthogonal floorplan satisfying the area and adjacency requirements of any vertex-weighted maximal planar graph which is its dual.

15 citations


Journal ArticleDOI
TL;DR: The face adjacency conditions derived in this paper are used to select the octal value of the rightmost digit that must be appended to or deleted from the locational code of the same-size neighbour in order to identify smaller or larger neighbours of a selected node in the specified direction.

14 citations


Journal ArticleDOI
TL;DR: Two sets of primitive Euler operators are presented, which build and manipulate such representations while maintaining their topological integrity, and the use of such operators is demonstrated in connection with two algorithms for building a Delaunay tetrahedralization.
Abstract: A polyhedral decomposition can be unambiguously described as the collection of four primitive elements (i.e., polyhedra, facets, edges, and vertices) plus their mutual adjacency relations. We consider here the problem of representing a specific kind of polyhedral decomposition, i.e., a tetrahedralization. We describe two different representations for a tetrahedralization. The first one can only model polyhedral decompositions with tetrahedral cells, while the second one is suitable for describing any partition of a volume into polyhedral cells with triangular facets. We present two sets of primitive Euler operators, which build and manipulate such representations while maintaining their topological integrity. The use of such operators is demonstrated in connection with two algorithms for building a Delaunay tetrahedralization, which show the different uses of the two representations.

Journal ArticleDOI
TL;DR: It is proposed that there are significant benefits from applying region adjacency analysis to comparatively raw (‘low-level’) imagery, such as is produced by many remote sensing systems, rather than to highly processed images.
Abstract: The analysis of image data by region adjacency methods is a long-established, though not currently widely-used, method of region analysis. This lack of use is probably because the technique has usually been applied only to images which have been subjected to many stages of processing and which, as a result, contain only a small number of regions whose analysis could easily be achieved manually. The present paper proposes that there are significant benefits from applying region adjacency analysis to comparatively raw (‘low-level’) imagery, such as is produced by many remote sensing systems, rather than to highly processed images. In fact many high-level processing requirements can be better performed by region analysis of the low-level data. Examples include blob extraction, specific region neighbour searching and region merging. The latter operation is extremely important in reducing complex image-derived data to a less complex form suitable for entry into a geographical information system (GIS)....
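A region adjacency graph of the kind applied here to low-level imagery can be built directly from a labelled raster. A minimal sketch, assuming 4-connectivity and a toy three-region image of our own invention:

```python
def region_adjacency_graph(labels):
    """Build a region adjacency graph from a labelled raster (a list of
    rows of region ids), using 4-connectivity: two regions are adjacent
    if any two of their pixels share an edge."""
    rows, cols = len(labels), len(labels[0])
    adj = {}
    for r in range(rows):
        for c in range(cols):
            a = labels[r][c]
            adj.setdefault(a, set())
            for dr, dc in ((0, 1), (1, 0)):        # right and down neighbours
                rr, cc = r + dr, c + dc
                if rr < rows and cc < cols:
                    b = labels[rr][cc]
                    if a != b:
                        adj[a].add(b)
                        adj.setdefault(b, set()).add(a)
    return adj

# A toy 'low-level' image with three regions.
img = [[1, 1, 2],
       [1, 3, 2],
       [3, 3, 2]]
rag = region_adjacency_graph(img)
```

Operations such as region merging or neighbour searching then become simple set operations on `rag` rather than pixel-level scans.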

Proceedings ArticleDOI
04 Oct 1990
TL;DR: An interaction model for object-oriented geographic databases is presented and a browsing technique based on the model is outlined, able to satisfy users interested in getting a general idea about the contents of the database, as well as those with more specific tasks to accomplish.
Abstract: An interaction model for object-oriented geographic databases is presented. A browsing technique based on the model is outlined. The approach is uniform for navigating both the intensional and extensional part of the database. The proposed interaction model is flexible and is suitable for satisfying many purposes. It is able to satisfy users interested in getting a general idea about the contents of the database, as well as those with more specific tasks to accomplish. Depending on their requirements, users may control the degree of complexity of the information presented on the screen. Two basic criteria are adopted for browsing the database: logic adjacency and spatial adjacency between objects. Logic adjacency is determined by four conceptual links, while spatial adjacency is related to the two-dimensional (map-based) view of geographic entities.

Journal ArticleDOI
TL;DR: A class of adjacency preserving embeddings that map a node in the schema graph into a subcube or into adjacent subcubes of a hypercube are studied, motivated by the technique used for state assignment in asynchronous sequential machines.

Journal ArticleDOI
TL;DR: A hybrid filling technique is proposed, which also processes conflicting adjacency information created by subsampling or digitization errors, to address issues arising in the presence of objects defined partially in Volume and partially in Boundary Representation.

Proceedings ArticleDOI
01 Mar 1990
TL;DR: An algorithm is described which produces the extraction of an object's contours in a binary image while storing only two raster lines at a time during processing, which is a powerful tool for processing large-size engineering drawings.
Abstract: An algorithm is described which produces the extraction of an object's contours in a binary image while storing only two raster lines at a time during processing. The major data structure used is a block adjacency graph which occupies much less space (typically one hundredth) than the original image. Because this algorithm has very low costs of memory space and processing time, it is a powerful tool for processing large-size engineering drawings. Applications of the method to the recognition of some loop-structure logical symbols are described.

Book ChapterDOI
01 Jan 1990
TL;DR: In this paper, a graph is conceived as a pair (V,ρ) where V is a set and ρ is an irreflexive symmetric relation on V.
Abstract: In this paper by graph we always mean a graph without loops and without multiple lines. Such a graph will be conceived as a pair (V,ρ) where V is a set and ρ is an irreflexive symmetric relation on V. The elements of V are the vertices and ρ represents the adjacency. Concerning other basic concepts we will use the common terminology.
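The definition can be checked mechanically. A small sketch of our own, verifying that a set of ordered pairs over V is irreflexive and symmetric, i.e. a valid adjacency relation ρ in the chapter's sense:

```python
def is_adjacency_relation(V, rho):
    """Check that rho (a set of ordered pairs over V) is irreflexive
    and symmetric: no vertex is adjacent to itself, and every pair
    (a, b) is matched by (b, a)."""
    irreflexive = all((v, v) not in rho for v in V)
    symmetric = all((b, a) in rho for (a, b) in rho)
    return irreflexive and symmetric

V = {1, 2, 3}
rho = {(1, 2), (2, 1), (2, 3), (3, 2)}      # the path 1-2-3
```

Irreflexivity rules out loops and symmetry rules out directed lines, matching the paper's restriction to graphs without loops and without multiple lines.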

Proceedings ArticleDOI
01 Nov 1990
TL;DR: This paper presents an algorithm for state assignment in incompletely specified finite state machines, based on a set of heuristic rules that are used to build a desired adjacency graph in which a weight is associated with each possible adjacency.
Abstract: This paper presents an algorithm for state assignment in incompletely specified finite state machines, based on a set of heuristic rules. These rules are used to build a desired adjacency graph in which a weight is associated with each possible adjacency. A new method of assigning codes to each state is presented with the goal of choosing adjacencies with large weights.

Proceedings ArticleDOI
01 Jan 1990
TL;DR: In this paper, the authors developed both spatial and temporal representations for databases to handle the inferring of attribute values as a result of possible queries, and their application to typical geographic/temporal databases are described.
Abstract: A common method for storing knowledge in databases is in the form of attribute values. For databases with spatial and temporal knowledge however, it is not feasible to store all of the attribute values that one might be interested in. For example, in the spatial domain for a geographic database it is impossible to anticipate the very large number of queries regarding nearness, spatial adjacency, or the possibility of finding feasible paths between arbitrary locations. Similarly in the temporal domain it is impossible to anticipate all queries regarding the temporal duration, range, and overlap of complex temporal events. We have developed both spatial and temporal representations for databases to handle the inferring of attribute values as a result of possible queries. The temporal representation involves the creation of time tags for attribute values and information regarding the persistence of those values. The spatial representation consists of a labeled array in which each label corresponds to a unique object (or class of objects) in the database. Preprocessing of spatial “scenes” allows the system to rapidly obtain paths, determine objects in a given region of interest, etc. These representations, and their application to typical geographic/temporal databases, are described in the paper. A natural language interface, developed earlier, was extended to work with these spatial and temporal representations.

Proceedings ArticleDOI
05 Feb 1990
TL;DR: A processor that performs general set operations results, as well as a system that can answer various knowledge base queries and guide a knowledge base search, are described.
Abstract: A new bidirectional optical associative processor is described for searching a hierarchical database that is stored as an adjacency matrix. The paper discusses how the processor can answer relatively complex queries on a knowledge base when the queries are formulated as combinations of set closures, unions, intersections, and complementations. Thus, a processor that performs general set operations results, as well as a system that can answer various knowledge base queries and guide a knowledge base search. These are new operations for associative processors that increase their utility. This new associative processor operates on entities and their attributes. It can be viewed as a type lattice processor (since the entities and attributes form a hierarchy known as a type lattice), as a closure processor (since it performs closure operations that list all attributes of an entity [or entities] or all entities with a given attribute [or attributes]), or as an adjacency processor (since the connection matrix used stores adjacent associations of attributes and entities).
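The closure queries described here can be emulated in conventional software by boolean operations on the adjacency matrix. A sketch using Warshall's algorithm, offered as a standard software stand-in for the closure operation, not as a model of the optical processor itself:

```python
def transitive_closure(adj):
    """Warshall's algorithm: boolean closure of an adjacency matrix,
    marking every entity reachable from every other along stored
    associations."""
    n = len(adj)
    reach = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            if reach[i][k]:
                for j in range(n):
                    reach[i][j] = reach[i][j] or reach[k][j]
    return reach

# A small hierarchy stored as an adjacency matrix: 0 -> 1 -> 2
# (say, entity -> attribute -> sub-attribute in a type lattice).
A = [[0, 1, 0],
     [0, 0, 1],
     [0, 0, 0]]
closure = transitive_closure(A)
```

A closure query such as "all attributes of entity 0" then reads off row 0 of `closure`, which is the kind of operation the associative processor performs in hardware.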

01 Jan 1990
TL;DR: A graph theoretic approach to inexact scene matching is presented which is useful in dealing with problems due to imperfect image segmentation, and a method of handling oversegmentation and undersegmentation problems is presented.
Abstract: The ability to match two scenes is a fundamental requirement in a variety of computer vision tasks. A graph theoretic approach to inexact scene matching is presented which is useful in dealing with problems due to imperfect image segmentation. A scene is described by a set of graphs, with nodes representing objects and arcs representing relationships between objects. Each node has a set of values representing the relations between pairs of objects, such as angle, adjacency, or distance. With this method of scene representation, the task in scene matching is to match two sets of graphs. Because of segmentation errors, variations in camera angle, illumination, and other conditions, an exact match between the sets of observed and stored graphs is usually not possible. In the developed approach, the problem is represented as an association graph, in which each node represents a possible mapping of an observed region to a stored object, and each arc represents the compatibility of two mappings. Nodes and arcs have weights indicating the merit of a region-object mapping and the degree of compatibility between two mappings. A match between the two graphs corresponds to a clique, or fully connected subgraph, in the association graph. The task is to find the clique that represents the best match. Fuzzy relaxation is used to update the node weights using the contextual information contained in the arcs and neighboring nodes. This simplifies the evaluation of cliques. A method of handling oversegmentation and undersegmentation problems is also presented. The approach is tested with a set of realistic images which exhibit many types of segmentation errors.

Proceedings ArticleDOI
01 Apr 1990
TL;DR: In this paper, the use of a Boltzmann machine to search for the shortest path in a directed graph (digraph) whose edge weights are equal is discussed, and an example application of the proposed method employing a 10-node digraph is demonstrated.
Abstract: The use of a Boltzmann machine to search for the shortest path in a directed graph (digraph) whose edge weights are equal is discussed. The adjacency matrix of the digraph is employed in the Boltzmann machine topology such that each entry in the adjacency matrix corresponds to a computation node of the Boltzmann machine. The quadratic performance function, for which the Boltzmann machine finds minima, is defined using the syntactic constraints that a path specification has to satisfy. An example application of the proposed method employing a 10-node digraph is demonstrated.
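As a point of comparison with the stochastic search, the same equal-weight shortest-path problem is solved deterministically by breadth-first search over the adjacency matrix. A minimal sketch on an illustrative 5-node digraph (not the paper's 10-node example):

```python
from collections import deque

def shortest_path(adj, src, dst):
    """BFS over the adjacency matrix of a digraph with equal edge
    weights; returns one shortest path as a list of nodes, or None
    if dst is unreachable from src."""
    n = len(adj)
    prev = {src: None}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in range(n):
            if adj[u][v] and v not in prev:
                prev[v] = u
                queue.append(v)
    return None

# Illustrative digraph: long route 0->1->2->4 and shortcut 0->3->4.
A = [[0, 1, 0, 1, 0],
     [0, 0, 1, 0, 0],
     [0, 0, 0, 0, 1],
     [0, 0, 0, 0, 1],
     [0, 0, 0, 0, 0]]
```

With unit edge weights, BFS is exact and linear in the matrix size; the interest of the Boltzmann formulation lies in its parallel, energy-minimizing hardware mapping rather than in asymptotic speed.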

Proceedings ArticleDOI
01 Jul 1990
TL;DR: A hierarchical data structure which can be mapped into a computer architecture that will efficiently store, manipulate, and display time varying images of multi-dimensional biomedical structures is proposed.
Abstract: Biomedical structures such as the beating heart are inherently multi-dimensional in nature. In addition to the three spatial directions which represent the object location and orientation, higher order dimensions can be assigned to represent various object parameters such as time and tissue density. In this paper, we propose a hierarchical data structure which can be mapped into a computer architecture that will efficiently store, manipulate, and display time varying images of multi-dimensional biomedical structures. This n-D object representation scheme, which is called a linear hypertree, is a generalization of the linear quadtree and octree from their respective 2-D and 3-D spaces to an n-D environment. It is a hierarchical data structure which represents multi-dimensional volumetric information in a 2^n-way branching tree. The basic properties of a linear hypertree are briefly presented along with the procedure for encoding the node rectangular coordinates into a hierarchical locational code. Two decoding techniques that transform the node locational code into its rectangular coordinate format are introduced. Some adjacency concepts in a multi-dimensional environment are defined. A neighbor finding algorithm which identifies the locational code of the adjacent hypertree node in a given direction is also presented. This algorithm does not convert the locational code to its rectangular coordinate form; instead, it operates directly on the node locational code in order to determine the neighbor's identification. Finally, procedures for computing the locational codes of larger and smaller size neighbors are also included.© (1990) COPYRIGHT SPIE--The International Society for Optical Engineering. Downloading of the abstract is permitted for personal use only.
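Bit interleaving in the style of Morton codes is one common way to realize such hierarchical locational codes: n coordinates are merged so that each group of n bits selects one of the 2^n branches at a tree level. The exact bit layout below is our assumption, not necessarily the paper's encoding:

```python
def encode(coords, bits):
    """Interleave the bits of n coordinates into one locational code,
    most significant level first, so each group of n bits names a
    2**n-way branch of the hypertree."""
    code = 0
    for level in range(bits - 1, -1, -1):
        for x in coords:
            code = (code << 1) | ((x >> level) & 1)
    return code

def decode(code, n, bits):
    """Recover the n rectangular coordinates from a locational code
    produced by encode()."""
    coords = [0] * n
    for level in range(bits - 1, -1, -1):
        for d in range(n):
            bit = (code >> (level * n + (n - 1 - d))) & 1
            coords[d] = (coords[d] << 1) | bit
    return coords
```

The decode function is one of the two directions the paper discusses; neighbor finding would then manipulate the code directly, without passing through the coordinate form.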

Journal ArticleDOI
Mingzuo Shen
TL;DR: A comparison of Sinanoglu's VIF and generalized graphs is presented, in which the method of Sinanoglu is incorporated into the broad scheme of graph spectral theory, and an abridged history of generalized graphs in theoretical chemistry is given.
Abstract: A comparison of Sinanoglu's VIF (Ref. 1) and generalized graphs is presented. Generalized graphs have vertex and edge weights. An abridged history of generalized graphs in theoretical chemistry is given. VIF's are generalized graphs and therefore have adjacency matrices. The “graphical” rules of Sinanoǧlu can be represented by congruent transformations on the adjacency matrix. Thus the method of Sinanoǧlu is incorporated into the broad scheme of graph spectral theory. If the signature of a graph is defined as the collection of the number of positive, zero, and negative eigenvalues of the graph's adjacency matrix, then it is identical to the all-important {n+, n0, n−}, the {number of positive, zero, and negative loops of a reduced graph} or the {number of bonding, nonbonding, and antibonding MOs}. A special case of the Sinanoǧlu rules is the “multiplication of a vertex” by (−1). In matrix language, this multiplication is an orthogonal transformation of the adjacency matrix. Thus, one can multiply any vertex of a generalized graph by −1 without changing its eigenvalues.
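The invariance claimed at the end (multiplying a vertex by −1 leaves the eigenvalues unchanged) can be checked numerically: since tr(A^k) is the k-th power sum of the eigenvalues, equal power traces imply an equal eigenvalue multiset. A small pure-Python sketch of our own, using the path graph as the example:

```python
def mat_mul(A, B):
    """Plain dense matrix product."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def power_traces(A):
    """tr(A), tr(A^2), ..., tr(A^n): these power sums determine the
    eigenvalue multiset of A."""
    n = len(A)
    P, traces = A, []
    for _ in range(n):
        traces.append(sum(P[i][i] for i in range(n)))
        P = mat_mul(P, A)
    return traces

def flip_vertex(A, v):
    """'Multiplication of vertex v by -1': the congruent (and orthogonal)
    transformation D A D with D = diag(1, ..., -1, ..., 1)."""
    n = len(A)
    return [[A[i][j] * (-1 if i == v else 1) * (-1 if j == v else 1)
             for j in range(n)] for i in range(n)]

# Adjacency matrix of the path 1-2-3 (unit vertex and edge weights).
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
```

Because D is both orthogonal and a congruence, the flip preserves the full spectrum, and hence the signature {n+, n0, n−} discussed above.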

Journal ArticleDOI
TL;DR: A dual-based simplex method for cases of the transhipment problem, which include the shortest paths problem, is presented that finds an optimal solution in no more than $\min(|E| - |N| + 1, |N|(|N|-1)/2)$ pivots.
Abstract: The dual linear programs for the transhipment problem over a directed graph, $G = \{N, E\}$, are shown to have polyhedra with properties that make them well suited to vertex visiting solution techniques, like the simplex method. In particular, nondegenerate cases are shown to have feasible regions with considerably fewer extreme points than the feasible sets for primal problems. The adjacency structure of feasible bases is also shown to be quite favorable. In fact, the Hirsch Conjecture is valid when the network is complete. A dual-based simplex method for cases of the transhipment problem, which include the shortest paths problem, is presented that finds an optimal solution in no more than $\min(|E| - |N| + 1, |N|(|N|-1)/2)$ pivots.

Journal ArticleDOI
TL;DR: In this paper, the concept of lexically ordered adjacency matrix of a graph is introduced and it is proved that every adjacency matrix is isomorphic to at least one lexically ordered adjacency matrix.
Abstract: The concept of lexically ordered adjacency matrix of a graph is introduced and it is proved that every adjacency matrix is isomorphic to at least one lexically ordered adjacency matrix. An algorithm for the classification of strongly regular graphs is developed, where the property of lexical ordering is used as a means to reduce the number of generated adjacency matrices. We also describe other pruning methods that can be used.
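The idea of lexical ordering as a pruning device can be illustrated by brute force: among all vertex relabellings, pick the matrix whose row-by-row bit string is lexically smallest, giving a canonical form under which isomorphic graphs coincide. This brute-force canonicalization is our own illustration, not the paper's classification algorithm:

```python
from itertools import permutations

def matrix_string(A):
    """Flatten an adjacency matrix row by row into a bit string."""
    return ''.join(str(x) for row in A for x in row)

def lexically_least(A):
    """Among all vertex relabellings of A, return the adjacency matrix
    whose row-by-row bit string is lexically smallest (a canonical
    form). Brute force over n! permutations, so small graphs only."""
    n = len(A)
    best = None
    for p in permutations(range(n)):
        B = [[A[p[i]][p[j]] for j in range(n)] for i in range(n)]
        s = matrix_string(B)
        if best is None or s < best[0]:
            best = (s, B)
    return best[1]

# Two labellings of the same path on 3 vertices.
A1 = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]   # path 0-1-2
A2 = [[0, 1, 1], [1, 0, 0], [1, 0, 0]]   # the same path, relabelled
```

Restricting generation to matrices that are already extremal in this ordering is what lets the classification algorithm avoid producing most isomorphic duplicates.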

Proceedings ArticleDOI
16 Jun 1990
TL;DR: A kind of tree structure called extended binary tree (EBT) is presented to represent the line adjacency graph (LAG) in order to reduce the computational complexities of LAG-based algorithms in binary image processing.
Abstract: A kind of tree structure called extended binary tree (EBT) is presented to represent the line adjacency graph (LAG) in order to reduce the computational complexities of LAG-based algorithms in binary image processing. The traversal and the storage of the EBT are discussed. Applications of the structure in engineering drawing entry are shown.


Proceedings ArticleDOI
01 Jan 1990
TL;DR: Rocks can be effectively used as landmarks for robot navigation through rocky terrain; for the robot to do this, it must be able to automatically build models of rocks, and a method for automatically building such models is discussed.
Abstract: Rocks can be effectively used as landmarks for robot navigation through rocky terrains. For the robot to do this, it has to be able to automatically build models of rocks. For the rocks world, models containing qualitatively described surfaces are used. A rock is modeled as a graph. The surface of the rock is decomposed into surface patches separated by crude edges. Each surface patch is represented by a node in the graph. The arcs represent the adjacency relationships between the surface patches. To build such a model, the following approach is taken. For a scene consisting of a single approximately convex object, easily distinguishable from its background, a silhouette of the object is obtained. The silhouette is partitioned into crude segments according to the general shape of the segments. Each segment is typed as either concave, convex or straight. The classification is done by measuring the mean and standard deviation of distances of points of the segment from the straight line joining its ends. The qualitative model for the rock is built by initially assuming that the silhouette is a cross-sectional view of the rock. A simple cyclic graph consisting of nodes with surface types consistent with the segment types is built. Thus a five-segment silhouette consisting of three convex segments, a straight one and a concave one results in a graph with 5 nodes, three of which are convex surfaces, one flat (corresponding to the straight silhouette segment) and the other concave. The model is improved by moving the camera to a different position and obtaining another silhouette. From the positions of the camera and the segment types, either new nodes are created or the surface types of the currently existing nodes are modified. A method for automatically building such models is discussed.

Book ChapterDOI
01 Jan 1990
TL;DR: Graphs are an unconstrained structure where each object may have zero, one or many ‘next’ and ‘previous’ objects, and this generality adds an extra degree of freedom in structuring data to relate to the structure of real-world situations.
Abstract: In chapter 3 we introduced linear data types as a collection of objects where each object, in general, could have one ‘next’ object and one ‘previous’ object. Trees were non-linear data types where the above restriction was relaxed and each object could have more than one ‘next’ object (children of a node) but, at most, only one ‘previous’ object (parent of a node). We can further generalise tree structures by allowing an object to have more than one ‘previous’ object. A graph is such an unconstrained structure where each object may have zero, one or many ‘next’ and ‘previous’ objects. This generality obviously adds an extra degree of freedom in structuring data to relate to the structure of real-world situations.
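The generalization described here, from at most one 'previous' object in a tree to arbitrarily many in a graph, can be made concrete with a small adjacency-list structure (an illustrative sketch, not code from the chapter):

```python
class Digraph:
    """A graph in the chapter's unconstrained sense: every object may
    have any number of 'next' and 'previous' objects, stored as
    adjacency lists in both directions."""
    def __init__(self):
        self.next = {}      # successors of each node
        self.prev = {}      # predecessors of each node

    def add_edge(self, a, b):
        self.next.setdefault(a, set()).add(b)
        self.prev.setdefault(b, set()).add(a)
        self.next.setdefault(b, set())
        self.prev.setdefault(a, set())

# A node with two 'previous' objects: impossible in a tree,
# natural in a graph.
g = Digraph()
g.add_edge('a', 'c')
g.add_edge('b', 'c')
```

Keeping both directions explicit is exactly the freedom trees forbid: node 'c' here has two parents, which no tree structure could represent.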