
Showing papers on "Delaunay triangulation published in 2004"


Proceedings ArticleDOI
02 Jun 2004
TL;DR: This paper tackles the problem of computing topological invariants of geometric objects in a robust manner, using only point cloud data sampled from the object, and produces a nested family of simplicial complexes, which represent the data at different feature scales, suitable for calculating persistent homology.
Abstract: This paper tackles the problem of computing topological invariants of geometric objects in a robust manner, using only point cloud data sampled from the object. It is now widely recognised that this kind of topological analysis can give qualitative information about data sets which is not readily available by other means. In particular, it can be an aid to visualisation of high dimensional data. Standard simplicial complexes for approximating the topological type of the underlying space (such as the Čech, Rips, or α-shape complexes) produce simplicial complexes whose vertex set has the same size as the underlying set of point cloud data. Such constructions are sometimes still tractable, but are wasteful (of computing resources) since the homotopy types of the underlying objects are generally realisable on much smaller vertex sets. We obtain smaller complexes by choosing a set of 'landmark' points from our data set, and then constructing a "witness complex" on this set using ideas motivated by the usual Delaunay complex in Euclidean space. The key idea is that the remaining (non-landmark) data points are used as witnesses to the existence of edges or simplices spanned by combinations of landmark points. Our construction generalises the topology-preserving graphs of Martinetz and Schulten [MS94] in two directions. First, it produces a simplicial complex rather than a graph. Secondly, it actually produces a nested family of simplicial complexes, which represent the data at different feature scales, suitable for calculating persistent homology [ELZ00, ZC04]. We find that in addition to the complexes being smaller, they also provide (in a precise sense) a better picture of the homology, with less noise, than the full scale constructions using all the data points. We illustrate the use of these complexes in qualitatively analyzing a data set of 3 × 3 pixel patches studied by David Mumford et al [LPM03].
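The witness idea above can be sketched for the 1-skeleton: landmarks a and b get an edge when some data point sees them as its two nearest landmarks. A minimal illustration assuming NumPy and random 2-D data; the paper's construction is more general (relaxed witness conditions, higher-dimensional simplices, and a scale parameter giving the nested family).

```python
import numpy as np

def witness_edges(points, landmark_idx):
    """1-skeleton of a (strict) witness complex: landmarks a and b are
    joined when some data point has them as its two nearest landmarks.
    The paper's construction is more general (relaxed witnesses, higher
    simplices, and a scale parameter giving a nested family)."""
    landmarks = points[landmark_idx]
    edges = set()
    for w in points:
        d = np.linalg.norm(landmarks - w, axis=1)
        a, b = np.argsort(d)[:2]
        edges.add(tuple(sorted((landmark_idx[a], landmark_idx[b]))))
    return edges

rng = np.random.default_rng(0)
pts = rng.random((200, 2))
lms = [0, 1, 2, 3, 4]           # 5 landmarks chosen from the data
E = witness_edges(pts, lms)
```

Note how the complex lives only on the 5 landmarks; the remaining 195 points act purely as witnesses.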

394 citations


Proceedings ArticleDOI
08 Jul 2004
TL;DR: A noise-resistant algorithm for reconstructing a watertight surface from point cloud data that forms a Delaunay tetrahedralization, then uses a variant of spectral graph partitioning to decide whether each tetrahedron is inside or outside the original object.
Abstract: We introduce a noise-resistant algorithm for reconstructing a watertight surface from point cloud data. It forms a Delaunay tetrahedralization, then uses a variant of spectral graph partitioning to decide whether each tetrahedron is inside or outside the original object. The reconstructed surface triangulation is the set of triangular faces where inside and outside tetrahedra meet. Because the spectral partitioner makes local decisions based on a global view of the model, it can ignore outliers, patch holes and undersampled regions, and surmount ambiguity due to measurement errors. Our algorithm can optionally produce a manifold surface. We present empirical evidence that our implementation is substantially more robust than several closely related surface reconstruction programs.
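The inside/outside labelling rests on spectral graph partitioning, whose core is the sign of the Fiedler vector (the eigenvector of the second-smallest Laplacian eigenvalue). A toy sketch on a six-node graph, not the authors' implementation:

```python
import numpy as np

def fiedler_split(n, edges):
    """Two-way spectral partition: sign of the Fiedler vector, the
    eigenvector for the second-smallest eigenvalue of the graph
    Laplacian. The paper applies this idea (in a weighted variant) to
    label tetrahedra as inside or outside."""
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1; L[j, j] += 1
        L[i, j] -= 1; L[j, i] -= 1
    vals, vecs = np.linalg.eigh(L)   # eigenvalues in ascending order
    fiedler = vecs[:, 1]
    return {i for i in range(n) if fiedler[i] >= 0}

# Two triangles joined by a single bridge edge: the split recovers them.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
side = fiedler_split(6, edges)
```

Because the Fiedler vector is a global object, the cut it induces ignores local noise, which is the property the abstract exploits for outliers and undersampled regions.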

285 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigate whether a very short arc may still contain significant orbit information, with predictive value, e.g., allowing one to compute useful ephemerides with a well-defined uncertainty for some time in the future.
Abstract: Most asteroid discoveries consist of a few astrometric observations over a short time span, and in many cases the amount of information is not enough to compute a full orbit according to the least squares principle. We investigate whether such a Very Short Arc may nonetheless contain significant orbit information, with predictive value, e.g., allowing one to compute useful ephemerides with a well-defined uncertainty for some time in the future. For short enough arcs, all the significant information is contained in an attributable, consisting of two angles and two angular velocities for a given time; an apparent magnitude is also often available. In this case, no information on the geocentric range r and range-rate ṙ is available from the observations themselves. However, the values of (r, ṙ) are constrained to a compact subset, the admissible region, if we can assume that the discovered object belongs to the Solar System, is not a satellite of the Earth and is not a shooting star (very small and very close). We give a full algebraic description of the admissible region, including geometric properties such as the presence of not more than two connected components. The admissible region can be sampled by selecting a finite number of points in the (r, ṙ) plane, each corresponding to a full set of six initial conditions (given the four-component attributable) for the asteroid orbit. Because the admissible region is a region in the plane, it can be described by a triangulation with the selected points as nodes. We show that triangulations with optimal properties, such as the Delaunay triangulations, can be generated by an effective algorithm; however, the optimal triangulation depends upon the choice of a metric in the (r, ṙ) plane. Each node of the triangulation is a Virtual Asteroid, for which it is possible to propagate the orbit and predict ephemerides.
Thus for each time there is an image triangulation on the celestial sphere, and it can be used in a way similar to the use of the nominal ephemerides (with their confidence regions) in the classical case of a full least squares orbit.
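The sampling step can be sketched with SciPy: pick virtual-asteroid nodes in the (r, ṙ) plane and triangulate them. The uniform random box below is a stand-in for points actually sampled from the admissible region, and the Euclidean metric stands in for the problem-specific metric the paper discusses:

```python
import numpy as np
from scipy.spatial import Delaunay

# Virtual-asteroid nodes sampled in the (r, r-dot) plane; the box below
# is a stand-in for points drawn from the actual admissible region.
rng = np.random.default_rng(1)
nodes = rng.uniform(low=[0.1, -0.02], high=[3.0, 0.02], size=(40, 2))
tri = Delaunay(nodes)   # each node of tri is a Virtual Asteroid
```

Propagating each node's orbit then maps this triangulation to the "image triangulation" on the celestial sphere mentioned above.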

191 citations


01 Jan 2004
TL;DR: The Delaunay triangulation, in both the classic and a more generalized sense, is studied in this paper for minimizing the linear interpolation error (measured in the L^p norm) for a given function.
Abstract: The Delaunay triangulation, in both the classic and a more generalized sense, is studied in this paper for minimizing the linear interpolation error (measured in the L^p norm) for a given function. The classic Delaunay triangulation can then be characterized as an optimal triangulation that minimizes the interpolation error for the isotropic function ‖x‖^2 among all the triangulations with a given set of vertices. For a more general function, a function-dependent Delaunay triangulation is then defined to be an optimal triangulation that minimizes the interpolation error for this function, and its construction can be obtained by a simple lifting and projection procedure. The optimal Delaunay triangulation is the one that minimizes the interpolation error among all triangulations with the same number of vertices, i.e., the distribution of vertices is optimized in order to minimize the interpolation error. Such a function-dependent optimal Delaunay triangulation is proved to exist for any given convex continuous function. On an optimal Delaunay triangulation associated with f, it is proved that ∇f at the interior vertices can be exactly recovered from the function values on its neighboring vertices. Since the optimal Delaunay triangulation is difficult to obtain in practice, the concept of nearly optimal triangulation is introduced and two sufficient conditions are presented for a triangulation to be nearly optimal.
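The "simple lifting and projection procedure" is, for f(x) = ‖x‖², the classical construction of the Delaunay triangulation: lift each point (x, y) to (x, y, x² + y²) and keep the lower faces of the convex hull. A sketch using SciPy, cross-checked against SciPy's own Delaunay:

```python
import numpy as np
from scipy.spatial import ConvexHull, Delaunay

def delaunay_by_lifting(pts):
    """Lift (x, y) to (x, y, x^2 + y^2); the lower convex hull faces of
    the lifted points project to the Delaunay triangles. Replacing the
    paraboloid by the graph of a general convex f gives the
    function-dependent triangulation of the abstract."""
    lifted = np.hstack([pts, (pts ** 2).sum(axis=1, keepdims=True)])
    hull = ConvexHull(lifted)
    # lower-hull faces have an outward normal with negative z-component
    lower = hull.simplices[hull.equations[:, 2] < 0]
    return {frozenset(f) for f in lower}

rng = np.random.default_rng(2)
pts = rng.random((30, 2))
tris = delaunay_by_lifting(pts)
ref = {frozenset(s) for s in Delaunay(pts).simplices}  # should agree
```

For points in general position the two triangulations coincide, which is exactly the characterization via ‖x‖² stated in the abstract.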

163 citations


Journal ArticleDOI
TL;DR: The methodology described brings together a number of well-developed theories/techniques, including graph theory, Delaunay triangulation, the Voronoi diagram, urban morphology and Gestalt theory, in such a way that multiscale products can be derived.
Abstract: Building generalization is a difficult operation due to the complexity of the spatial distribution of buildings and for reasons of spatial recognition. In this study, building generalization is decomposed into two steps, i.e. building grouping and generalization execution. The neighbourhood model in urban morphology provides global constraints for guiding the global partitioning of building sets on the whole map by means of roads and rivers, by which enclaves, blocks, superblocks or neighbourhoods are formed; whereas the local constraints from Gestalt principles provide criteria for the further grouping of enclaves, blocks, superblocks and/or neighbourhoods. In the grouping process, graph theory, Delaunay triangulation and the Voronoi diagram are employed as supporting techniques. After grouping, some useful information, such as the sum of the building's area, the mean separation and the standard deviation of the separation of buildings, is attached to each group. By means of the attached information, an ...

162 citations


Journal ArticleDOI
TL;DR: This work considers online routing algorithms for routing between the vertices of embedded planar straight line graphs and presents two deterministic memoryless routing algorithms and a randomized memoryless algorithm that works for all triangulations.
Abstract: We consider online routing algorithms for routing between the vertices of embedded planar straight line graphs. Our results include (1) two deterministic memoryless routing algorithms, one that works for all Delaunay triangulations and the other that works for all regular triangulations; (2) a randomized memoryless algorithm that works for all triangulations; (3) an O(1) memory algorithm that works for all convex subdivisions; (4) an O(1) memory algorithm that approximates the shortest path in Delaunay triangulations; and (5) theoretical and experimental results on the competitiveness of these algorithms.
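A memoryless routing rule of the kind studied here is greedy routing: always forward to the neighbor closest to the destination, which is known to succeed on Delaunay triangulations (though it can get stuck on arbitrary triangulations). A sketch of that rule, not the paper's specific algorithms:

```python
import numpy as np
from scipy.spatial import Delaunay

def greedy_route(pts, s, t):
    """Forward a message from s to t, always choosing the neighbor
    closest (in Euclidean distance) to t. Memoryless; guaranteed to
    arrive on Delaunay triangulations, but can fail on arbitrary
    triangulations."""
    tri = Delaunay(pts)
    indptr, nbrs = tri.vertex_neighbor_vertices
    path = [s]
    while path[-1] != t:
        u = path[-1]
        cand = nbrs[indptr[u]:indptr[u + 1]]
        nxt = min(cand, key=lambda v: np.linalg.norm(pts[v] - pts[t]))
        if np.linalg.norm(pts[nxt] - pts[t]) >= np.linalg.norm(pts[u] - pts[t]):
            return None  # stuck in a local minimum (never on Delaunay)
        path.append(int(nxt))
    return path

rng = np.random.default_rng(3)
pts = rng.random((50, 2))
path = greedy_route(pts, 0, 49)
```

The distance to the destination strictly decreases at every hop, so on a Delaunay triangulation the loop always terminates at t.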

153 citations


Journal ArticleDOI
TL;DR: In this paper, the authors proposed a new sparse planar structure called partial Delaunay triangulation (PDT), which can be constructed locally and is denser than other known localized planar structures.
Abstract: We address the problem of localized scatternet formation for multihop Bluetooth-based personal area ad hoc networks. Nodes are assumed to know their positions and are able to establish connections with any of their neighboring nodes, located within their transmission radius, in the neighbor discovery phase. The next phase of the proposed formation algorithm is optional and can be applied to construct a sparse geometric structure in a localized manner. We propose here a new sparse planar structure, namely, partial Delaunay triangulation (PDT), which can be constructed locally and is denser than other known localized planar structures. In the next mandatory phase, the degree of each node is limited to seven by applying the Yao structure, and the master-slave relations in piconets are formed in created subgraphs. This phase consists of several iterations. In each iteration, undecided nodes with higher keys than any of their undecided neighbors apply the Yao structure to bound the degrees, decide master-slave relations on the remaining edges, and inform all neighbors about either deleting edges or master-slave decisions. To the best of our knowledge, our schemes are the first schemes that construct degree limited (a node has at most seven slaves) and connected piconets in multihop networks, without parking any node. The creation and maintenance require small overhead in addition to maintaining accurate location information for one-hop neighbors. The experiments confirm good functionality of created Bluetooth networks in addition to their fast creation and straightforward maintenance.
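The degree-capping step uses the Yao structure: partition the plane around each node into k cones and keep only the shortest edge in each, so out-degree is at most k (k = 7 matches the seven-slave limit). A sketch on a hypothetical unit-disk neighbor set, not the full scatternet protocol:

```python
import numpy as np

def yao_edges(pts, nbrs, k=7):
    """Keep, for each node, only the shortest edge in each of k angular
    cones; out-degree is then at most k. The paper applies this with
    k = 7 so that a Bluetooth master keeps at most seven slaves."""
    kept = set()
    for u, cand in nbrs.items():
        best = {}
        for v in cand:
            d = pts[v] - pts[u]
            cone = int((np.arctan2(d[1], d[0]) % (2 * np.pi))
                       // (2 * np.pi / k)) % k
            if cone not in best or \
               np.linalg.norm(d) < np.linalg.norm(pts[best[cone]] - pts[u]):
                best[cone] = v
        kept.update((u, v) for v in best.values())
    return kept

rng = np.random.default_rng(4)
pts = rng.random((30, 2))
# hypothetical transmission radius 0.4 defines the unit-disk neighbors
nbrs = {u: [v for v in range(30) if v != u
            and np.linalg.norm(pts[u] - pts[v]) < 0.4] for u in range(30)}
E = yao_edges(pts, nbrs)
```

In the paper this is applied on top of the PDT rather than the raw unit-disk graph, which keeps the result planar as well as degree-bounded.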

149 citations



Journal ArticleDOI
TL;DR: A new greedy algorithm for surface reconstruction from unorganized point sets that achieves topologically correct reconstruction in most cases and can handle surfaces with complex topology, boundaries, and nonuniform sampling.
Abstract: In this paper, we present a new greedy algorithm for surface reconstruction from unorganized point sets. Starting from a seed facet, a piecewise linear surface is grown by adding Delaunay triangles one by one. The most plausible triangles are added first and in such a way as to prevent the appearance of topological singularities. The output is thus guaranteed to be a piecewise linear orientable manifold, possibly with boundary. Experiments show that this method is very fast and achieves topologically correct reconstruction in most cases. Moreover, it can handle surfaces with complex topology, boundaries, and nonuniform sampling.

133 citations


01 Jan 2004
TL;DR: The computational cost of proposed new mesh smoothing schemes in the isotropic case is as low as Laplacian smoothing while the error-based mesh quality is provably improved.
Abstract: We present several mesh smoothing schemes based on the concept of optimal Delaunay triangulations. We define the optimal Delaunay triangulation (ODT) as the triangulation that minimizes the interpolation error among all triangulations with the same number of vertices. ODTs aim to equidistribute the edge length under a new metric related to the Hessian matrix of the approximated function. Therefore we define the interpolation error as the mesh quality and move each node to a new location, in its local patch, that reduces the interpolation error. With several formulas for the interpolation error, we derive a suitable set of mesh smoothers among which Laplacian smoothing is a special case. The computational cost of proposed new mesh smoothing schemes in the isotropic case is as low as Laplacian smoothing while the error-based mesh quality is provably improved. Our mesh smoothing schemes also work well in the anisotropic case.
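Laplacian smoothing, the special case mentioned above, moves each interior vertex to the centroid of its Delaunay neighbors. A minimal isotropic sketch with SciPy; the paper's ODT smoothers replace the plain centroid with error-based formulas:

```python
import numpy as np
from scipy.spatial import Delaunay

def laplacian_smooth(pts, interior, rounds=10):
    """Laplacian smoothing: each interior vertex moves to the centroid
    of its current Delaunay neighbors. This is the special case the
    paper's ODT-based smoothers reduce to in the isotropic setting."""
    pts = pts.copy()
    for _ in range(rounds):
        tri = Delaunay(pts)
        indptr, idx = tri.vertex_neighbor_vertices
        for u in interior:
            pts[u] = pts[idx[indptr[u]:indptr[u + 1]]].mean(axis=0)
    return pts

# square boundary held fixed; jittered interior points get smoothed
boundary = np.array([[0, 0], [1, 0], [1, 1], [0, 1],
                     [0.5, 0], [1, 0.5], [0.5, 1], [0, 0.5]], float)
rng = np.random.default_rng(5)
inner = rng.uniform(0.2, 0.8, (8, 2))
pts0 = np.vstack([boundary, inner])
smoothed = laplacian_smooth(pts0, interior=range(8, 16))
```

Each update is a local patch operation, which is why the cost per sweep is as low as plain Laplacian smoothing even for the error-based variants.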

129 citations


Journal ArticleDOI
TL;DR: The TOPOFIT method helps to detect conformational changes and topological differences in variable parts, which are particularly important for studies of variations in active/binding sites and protein classification.
Abstract: Similarity of protein structures has been analyzed using three-dimensional Delaunay triangulation patterns derived from the backbone representation. It has been found that structurally related proteins have a common spatial invariant part, a set of tetrahedrons, mathematically described as a common spatial subgraph volume of the three-dimensional contact graph derived from Delaunay tessellation (DT). Based on this property of protein structures, we present a novel common volume superimposition (TOPOFIT) method to produce structural alignments. Structural alignments are usually evaluated by the number of equivalent (aligned) positions (Ne) with the corresponding root mean square deviation (RMSD). The superimposition of the DT patterns allows one to uniquely identify a maximal common number of equivalent residues in the structural alignment. In other words, TOPOFIT identifies a feature point on the RMSD(Ne) curve, a topomax point, until which the topologies of two structures correspond to each other, including backbone and interresidue contacts, whereas a growing number of mismatches between the DT patterns occurs at larger RMSD (Ne) after the topomax point. It has been found that the topomax point is present in all alignments from different protein structural classes; therefore, the TOPOFIT method identifies common, invariant structural parts between proteins. The alignments produced by the TOPOFIT method have a good correlation with alignments produced by other current methods. This novel method opens new opportunities for the comparative analysis of protein structures and for more detailed studies on understanding the molecular principles of tertiary structure organization and functionality. The TOPOFIT method also helps to detect conformational changes and topological differences in variable parts, which are particularly important for studies of variations in active/binding sites and protein classification.
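The DT contact graph underlying TOPOFIT can be sketched directly: tessellate the 3-D points and join every pair of vertices that share a tetrahedron edge. Random coordinates below stand in for backbone (e.g. C-alpha) atoms:

```python
import numpy as np
from scipy.spatial import Delaunay

def contact_graph(coords):
    """Edges of the Delaunay-tessellation contact graph: two points are
    in contact if they share an edge of some tetrahedron. This is the
    graph whose common subgraphs TOPOFIT superimposes."""
    tess = Delaunay(coords)          # 3-D Delaunay tessellation
    edges = set()
    for tet in tess.simplices:
        for i in range(4):
            for j in range(i + 1, 4):
                a, b = sorted((int(tet[i]), int(tet[j])))
                edges.add((a, b))
    return edges

rng = np.random.default_rng(7)
coords = rng.random((25, 3))         # stand-in for C-alpha coordinates
E = contact_graph(coords)
```

Comparing two structures then reduces to finding a large common subgraph of two such graphs, which is the invariant part the abstract describes.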

01 Jan 2004
TL;DR: In this article, the authors provide a case study of what can go wrong and why with a simple algorithm for computing convex hulls in the plane and a Delaunay triangulation in space.
Abstract: The algorithms of computational geometry are designed for a machine model with exact real arithmetic. Substituting floating-point arithmetic for the assumed real arithmetic may cause implementations to fail. Although this is well known, there are no concrete examples with a comprehensive documentation of what can go wrong and why. In this paper, we provide a case study of what can go wrong and why. For our study, we have chosen two simple algorithms which are often taught, an algorithm for computing convex hulls in the plane and an algorithm for computing Delaunay triangulations in space. We give examples that make the algorithms fail in many different ways. We also show how to construct such examples systematically and discuss the geometry of the floating-point implementation of the orientation predicate. We hope that our work will be useful for teaching computational geometry.
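The failure mode is easy to reproduce: evaluate the 2-D orientation determinant once in double precision and once exactly (here with Python's Fraction), and search for sign disagreements over ulp-sized perturbations, in the spirit of the paper's systematic construction. The specific points below are illustrative, not the paper's exact examples:

```python
from fractions import Fraction
import math

def orient_float(p, q, r):
    """Sign of the 2-D orientation determinant in double precision."""
    d = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (d > 0) - (d < 0)

def orient_exact(p, q, r):
    """The same predicate evaluated exactly on the stored doubles."""
    p, q, r = [tuple(map(Fraction, v)) for v in (p, q, r)]
    d = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return (d > 0) - (d < 0)

# Perturb p by whole ulps around (0.5, 0.5) against q = (12, 12),
# r = (24, 24), which are exactly collinear with (0.5, 0.5), and record
# every grid point where the floating-point sign disagrees with the
# exact one.
q, r = (12.0, 12.0), (24.0, 24.0)
u = math.ulp(0.5)
failures = [(i, j) for i in range(64) for j in range(64)
            if orient_float((0.5 + i * u, 0.5 + j * u), q, r)
            != orient_exact((0.5 + i * u, 0.5 + j * u), q, r)]
```

Already at (i, j) = (1, 0) the subtractions round the perturbation away and the float predicate reports "collinear" while the exact sign is negative; convex hull and Delaunay algorithms built on such a filter inherit these errors.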

Journal ArticleDOI
TL;DR: A hybrid approach to accurate quantification of vascular structures from magnetic resonance angiography (MRA) images using level set methods and deformable geometric models constructed with 3-D Delaunay triangulation is presented.
Abstract: The aim of this paper is to present a hybrid approach to accurate quantification of vascular structures from magnetic resonance angiography (MRA) images using level set methods and deformable geometric models constructed with 3-D Delaunay triangulation. Multiple scale filtering based on the analysis of local intensity structure using the Hessian matrix is used to effectively enhance vessel structures with various diameters. The level set method is then applied to automatically segment vessels enhanced by the filtering with a speed function derived from enhanced MRA images. Since the goal of this paper is to obtain highly accurate vessel borders, suitable for use in fluid flow simulations, in a subsequent step, the vessel surface determined by the level set method is triangulated using 3-D Delaunay triangulation and the resulting surface is used as a parametric deformable model. Energy minimization is then performed within a variational setting with a first-order internal energy; the external energy is derived from 3-D image gradients. Using the proposed method, vessels are accurately segmented from MRA data.

Book ChapterDOI
TL;DR: The proposed method connects minutiae using a Delaunay triangulation and analyzes the relative position and orientation of each minutia with respect to its neighbors obtained by the triangle structure.
Abstract: We present a new technique for fingerprint minutiae matching. The proposed method connects minutiae using a Delaunay triangulation and analyzes the relative position and orientation of each minutia with respect to its neighbors obtained by the triangle structure. Due to non-linear deformations, we admit a certain degree of triangle deformation. If rotations and translations are present, the triangle structure does not change consistently. Two fingerprints are considered to match if their triangle structures are similar according to the neighbor relationship. The algorithm's performance is evaluated on a public-domain database.

Proceedings ArticleDOI
08 Jun 2004
TL;DR: Unlike previous algorithms, this algorithm does not need to compute the local feature size for generating the sample points which was a major bottleneck, and experiments show the usefulness of the algorithm in remeshing and meshing CAD surfaces that are piecewise smooth.
Abstract: This paper presents an algorithm for sampling and triangulating a smooth surface Σ ⊂ R3 where the triangulation is homeomorphic to Σ. The only assumption we make is that the input surface representation is amenable to certain types of computations, namely computations of the intersection points of a line with the surface, computations of the critical points of some height functions defined on the surface and its restriction to a plane, and computations of some silhouette points. The algorithm ensures bounded aspect ratio, size optimality, and smoothness of the output triangulation. Unlike previous algorithms, this algorithm does not need to compute the local feature size for generating the sample points, which was a major bottleneck. Experiments show the usefulness of the algorithm in remeshing and meshing CAD surfaces that are piecewise smooth.

01 Nov 2004
TL;DR: A survey of Delaunay-based surface reconstruction methods can be found in this article, where the authors outline the foundations of these methods from a geometric and algorithmic standpoint.
Abstract: Given a finite sampling $P \subset \mathbb{R}^d$ of an unknown surface $S$, surface reconstruction is concerned with the calculation of a model of $S$ from $P$. The model can be represented as a smooth or a triangulated surface, and is expected to match $S$ from a topological and geometric standpoint. In this survey, we focus on the recent developments of Delaunay-based surface reconstruction methods, which were the first methods (and in a sense still the only ones) for which one can precisely state properties of the reconstructed surface. We outline the foundations of these methods from a geometric and algorithmic standpoint. In particular, a careful presentation of the hypotheses used by these algorithms sheds light on the intrinsic difficulties of the surface reconstruction problem faced by any method, Delaunay-based or not.

Book ChapterDOI
01 Jan 2004
TL;DR: A fully Dynamic Constrained Delaunay Triangulation is achieved, able to efficiently maintain a consistent triangulated representation of dynamic polygonal domains.
Abstract: We present algorithms for the efficient insertion and removal of constraints in Delaunay Triangulations. Constraints are considered to be points or any kind of polygonal lines. Degenerations such as edge overlapping, self-intersections or duplicated points are allowed and are automatically detected and fixed on the fly. As a result, a fully Dynamic Constrained Delaunay Triangulation is achieved, able to efficiently maintain a consistent triangulated representation of dynamic polygonal domains. Several applications in the fields of data visualization, reconstruction, geographic information systems and collision-free path planning are discussed.

Journal ArticleDOI
TL;DR: In this paper, a finite element method with the adaptive Delaunay triangulation as mesh generator is used to analyze two-dimensional crack propagation problems and the resulting stress intensity factors and simulated crack propagation behavior are used to evaluate the effectiveness of the combined method.

Journal ArticleDOI
TL;DR: This paper presents a deterministic algorithm for generating a weighted Delaunay mesh which respects the input boundary and has no poor-quality tetrahedra, including slivers, and shows that an incremental weight pumping can be mixed seamlessly with vertex insertions in the weighted Delaunay refinement paradigm.
Abstract: Delaunay meshes with bounded circumradius to shortest edge length ratio have been proposed in the past for quality meshing. The only poor-quality tetrahedra, called slivers, that can occur in such a mesh can be eliminated by the sliver exudation method. This method has been shown to work for periodic point sets, but not with boundaries. Recently a randomized point-placement strategy has been proposed to remove slivers while conforming to a given boundary. In this paper we present a deterministic algorithm for generating a weighted Delaunay mesh which respects the input boundary and has no poor-quality tetrahedra, including slivers. As in previous work, we assume that no input angle is acute. Our result is achieved by combining the weight pumping method for sliver exudation and the Delaunay refinement method for boundary conformation.

Journal ArticleDOI
TL;DR: It is shown that there exists a simple online O(1)-memory c-competitive routing strategy that approximates the shortest path in triangulations possessing the diamond property, i.e., the total distance travelled by the algorithm to route a message between two vertices is at most a constant c times the shortest route.

Book
23 Feb 2004
TL;DR: This book covers Voronoi cells and Delaunay triangularization, neighborhood graphs such as nearest-neighbor and Gabriel graphs, and class cover catch digraphs, including distributional results for Cn,m-graphs, with applications to statistical pattern recognition.
Abstract: Contents: Preface. Acknowledgments.
1. Preliminaries: 1.1 Graphs and Digraphs. 1.2 Statistical Pattern Recognition. 1.3 Statistical Issues. 1.4 Applications. 1.5 Further Reading.
2. Computational Geometry: 2.1 Introduction. 2.2 Voronoi Cells and Delaunay Triangularization. 2.3 Alpha Hulls. 2.4 Minimum Spanning Trees. 2.5 Further Reading.
3. Neighborhood Graphs: 3.1 Introduction. 3.2 Nearest-Neighbor Graphs. 3.3 k-Nearest Neighbor Graphs. 3.4 Relative Neighborhood Graphs. 3.5 Gabriel Graphs. 3.6 Application: Nearest Neighbor Prototypes. 3.7 Sphere of Influence Graphs. 3.8 Other Relatives. 3.9 Asymptotics. 3.10 Further Reading.
4. Class Cover Catch Digraphs: 4.1 Catch Digraphs. 4.2 Class Covers. 4.3 Dominating Sets. 4.4 Distributional Results for Cn,m-graphs. 4.5 Characterizations. 4.6 Scale Dimension. 4.7 (alpha, beta) Graphs. 4.8 CCCD Classification. 4.9 Homogeneous CCCDs. 4.10 Vector Quantization. 4.11 Random Walk Version. 4.12 Further Reading.
5. Cluster Catch Digraphs: 5.1 Basic Definitions. 5.2 Dominating Sets. 5.3 Connected Components. 5.4 Variable Metric Clustering.
6. Computational Methods: 6.1 Introduction. 6.2 Kd-Trees. 6.3 Class Cover Catch Digraphs. 6.4 Cluster Catch Digraphs. 6.5 Voronoi Regions and Delaunay Triangularizations. 6.6 Further Reading.
References. Author Index. Subject Index.

Proceedings ArticleDOI
11 Jan 2004
TL;DR: The almost-Delaunay simplices are defined, some of their properties are derived, and algorithms for computing them are given, especially for neighbor analysis in three dimensions.
Abstract: Delaunay tessellations and Voronoi diagrams capture proximity relationships among sets of points in any dimension. When point coordinates are not known exactly, as in the case of 3D points representing protein atom coordinates, the Delaunay tessellation may not be robust; small perturbations in the coordinates may cause the Delaunay simplices to change. In this paper, we define the almost-Delaunay simplices, derive some of their properties, and give algorithms for computing them, especially for neighbor analysis in three dimensions. We sketch applications in proteins that will be described more fully in a companion paper in biology. http://www.cs.unc.edu/~debug/papers/AlmDel.
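The non-robustness is easy to demonstrate: for four nearly cocircular points, a small shift of one coordinate flips the diagonal of the Delaunay quad. A sketch with SciPy (the 1e-4 shift is illustrative; imprecise protein atom coordinates are the paper's motivation):

```python
import numpy as np
from scipy.spatial import Delaunay

def delaunay_tris(pts):
    return {frozenset(s) for s in Delaunay(pts).simplices}

# Four nearly cocircular points: nudging one corner of the unit square
# inward or outward flips the diagonal, so the exact Delaunay neighbor
# relation is not robust near degeneracies.
base = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
eps = 1e-4
pulled = base.copy(); pulled[2] = [1 - eps, 1 - eps]  # inside the circle
pushed = base.copy(); pushed[2] = [1 + eps, 1 + eps]  # outside the circle
tris_in, tris_out = delaunay_tris(pulled), delaunay_tris(pushed)
```

Almost-Delaunay simplices capture exactly this set of alternatives: all simplices that become Delaunay under some perturbation of bounded size.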

Proceedings ArticleDOI
30 Aug 2004
TL;DR: The system, called GeoPeer, aims to combine the advantages of peer-to-peer systems that implement distributed hash tables with the suitability of geographical routing for supporting location-constrained queries and information dissemination.
Abstract: This work presents a novel peer-to-peer system that is particularly well suited to support context-aware computing. The system, called GeoPeer, aims to combine the advantages of peer-to-peer systems that implement distributed hash tables with the suitability of geographical routing for supporting location-constrained queries and information dissemination. GeoPeer is comprised of two fundamental components: a Delaunay triangulation used to build a connected lattice of nodes and a mechanism to manage long range contacts that allows good routing performance, despite unbalanced distribution of nodes.

Proceedings ArticleDOI
08 Jun 2004
TL;DR: This algorithm to compute a Delaunay mesh conforming to a polyhedron possibly with small input angles is simple to implement as it avoids computing local feature sizes and protective zones explicitly.
Abstract: We present an algorithm to compute a Delaunay mesh conforming to a polyhedron, possibly with small input angles. The radius-edge ratios of most output tetrahedra are bounded by a constant, except possibly those that are provably close to small angles. Further, the mesh is graded, that is, edge lengths are at least a constant fraction of the local feature sizes at the edge endpoints. Unlike a previous algorithm, this algorithm is simple to implement as it avoids computing local feature sizes and protective zones explicitly. Our experimental results confirm our claims and show that few skinny tetrahedra remain.

Proceedings ArticleDOI
08 Jun 2004
TL;DR: This paper obtains the first such space-economical solutions for a number of fundamental problems, including three-dimensional convex hulls, two-dimensional Delaunay triangulations, fixed-dimensional range queries, and fixed- dimensional nearest neighbor queries.
Abstract: For many geometric problems, there are efficient algorithms that surprisingly use very little extra space other than the given array holding the input. For many geometric query problems, there are efficient data structures that need no extra space at all other than an array holding a permutation of the input. In this paper, we obtain the first such space-economical solutions for a number of fundamental problems, including three-dimensional convex hulls, two-dimensional Delaunay triangulations, fixed-dimensional range queries, and fixed-dimensional nearest neighbor queries.

Journal ArticleDOI
TL;DR: It is shown that the complexity of the Delaunay triangulation of points in R3 may be quadratic in the worst case, but it is only linear when the points are distributed on a fixed set of well-sampled facets of R3.
Abstract: Delaunay triangulations and Voronoi diagrams have found numerous applications in surface modeling, surface mesh generation, deformable surface modeling and surface reconstruction. Many algorithms in these applications begin by constructing the three-dimensional Delaunay triangulation of a finite set of points scattered over a surface. Their running-time therefore depends on the complexity of the Delaunay triangulation of such point sets.Although the complexity of the Delaunay triangulation of points in R3 may be quadratic in the worst case, we show in this paper that it is only linear when the points are distributed on a fixed set of well-sampled facets of R3 (e.g. the planar polygons in a polyhedron). Our bound is deterministic and the constants are explicitly given.

Journal ArticleDOI
TL;DR: In this article, the Delaunay triangulation is used to approximate a shortest path between two points p and q in the Delaunay triangulation, whose length is less than or equal to 2π/(3 cos(π/6)) times the Euclidean distance |pq|.
Abstract: In a geometric bottleneck shortest path problem, we are given a set S of n points in the plane, and want to answer queries of the following type: given two points p and q of S and a real number L, compute (or approximate) a shortest path between p and q in the subgraph of the complete graph on S consisting of all edges whose lengths are less than or equal to L. We present efficient algorithms for answering several query problems of this type. Our solutions are based on Euclidean minimum spanning trees, spanners, and the Delaunay triangulation. A result of independent interest is the following. For any two points p and q of S, there is a path between p and q in the Delaunay triangulation, whose length is less than or equal to 2π/(3 cos(π/6)) times the Euclidean distance |pq| between p and q, and all of whose edges have length at most |pq|.
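The cited stretch factor 2π/(3 cos(π/6)) ≈ 2.42 can be checked empirically: build the Delaunay graph with Euclidean edge weights, run Dijkstra, and compare graph distances to straight-line distances. A sketch with SciPy on random points:

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import dijkstra

# Delaunay graph with Euclidean edge weights on random points.
rng = np.random.default_rng(6)
pts = rng.random((60, 2))
tri = Delaunay(pts)
n = len(pts)
W = lil_matrix((n, n))
for s in tri.simplices:
    for a, b in [(s[0], s[1]), (s[1], s[2]), (s[0], s[2])]:
        W[a, b] = W[b, a] = np.linalg.norm(pts[a] - pts[b])

# Graph distances vs Euclidean distances; the ratio should never exceed
# the bound 2*pi/(3*cos(pi/6)) ~ 2.42 stated in the abstract.
D = dijkstra(W.tocsr(), directed=False)
euclid = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
stretch = D / (euclid + np.eye(n))  # eye avoids 0/0 on the diagonal
bound = 2 * np.pi / (3 * np.cos(np.pi / 6))
```

The bottleneck queries in the abstract additionally restrict the graph to edges of length at most L, but the same Delaunay spanning bound is the ingredient that makes those paths short.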

Journal ArticleDOI
TL;DR: This work gives an O(n log n)-time centralized algorithm that constructs a planar t-spanner for V such that the degree of each node is bounded from above, where 0<α<π/2 is an adjustable parameter.
Abstract: Given a set V of n points in a two-dimensional plane, we give an O(n log n)-time centralized algorithm that constructs a planar t-spanner for V such that the degree of each node is bounded from above, where 0<α<π/2 is an adjustable parameter. Here Cdel is the spanning ratio of the Delaunay triangulation, which is at most 2π/(3 cos(π/6)) ≈ 2.42. We also show, by applying the greedy method in Ref. [14], how to construct a low-weight bounded-degree planar spanner with spanning ratio ρ(α)2(1+∊) and the same degree bound, where ∊ is any positive real constant. Here, a structure is called low-weight if its total edge length is proportional to the total edge length of the Euclidean minimum spanning tree of V. Moreover, we show that our method can be extended to construct a planar bounded-degree spanner for unit disk graphs with the adjustable parameter α satisfying 0<α<π/3. Previously, only a centralized method [6] of constructing a bounded-degree planar spanner was known, with degree bound 27 and spanning ratio t≃10.02. The distributed implementation of this centralized method takes O(n^2) communications in the worst case. Our method can be converted to a localized algorithm where the total number of messages sent by all nodes is at most O(n).

Journal ArticleDOI
01 Sep 2004
TL;DR: This paper proposes to solve the global illumination problem through a progressive rendering method relying on an adaptive sampling of the image space based on an image metric based on a powerful vision model.
Abstract: In this paper, we propose to solve the global illumination problem through a progressive rendering method relying on an adaptive sampling of the image space. The refinement of this sample scheme is driven by an image metric based on a powerful vision model. A Delaunay triangulation of the sampled points is followed by a classification of these triangles into three classes. By interpolating each triangle according to the class it belongs to, we can obtain a high quality image by computing only a fraction of all the pixels and thus saving computation time.

Journal ArticleDOI
TL;DR: Algorithms to implement fully dynamic and kinetic three-dimensional unconstrained Delaunay triangulations, where the time evolution of the triangulation is not only governed by moving vertices but also by a changing number of vertices are described.