Showing papers on "Delaunay triangulation published in 2006"


Journal ArticleDOI
TL;DR: A one-parameter family of approximation schemes is presented that continuously bridges two important limits: Delaunay triangulation and maximum-entropy (max-ent) statistical inference.
Abstract: We present a one-parameter family of approximation schemes, which we refer to as local maximum-entropy approximation schemes, that bridges continuously two important limits: Delaunay triangulation and maximum-entropy (max-ent) statistical inference. Local max-ent approximation schemes represent a compromise, in the sense of Pareto optimality, between the competing objectives of unbiased statistical inference from the nodal data and the definition of local shape functions of least width. Local max-ent approximation schemes are entirely defined by the node set and the domain of analysis, and the shape functions are positive, interpolate affine functions exactly, and have a weak Kronecker-delta property at the boundary. Local max-ent approximation may be regarded as a regularization, or thermalization, of Delaunay triangulation which effectively resolves the degenerate cases resulting from the lack of uniqueness of the triangulation. Local max-ent approximation schemes can be taken as a convenient basis for the numerical solution of PDEs in the style of meshfree Galerkin methods. In test cases characterized by smooth solutions, we find that the accuracy of local max-ent approximation schemes is vastly superior to that of finite elements.
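As an illustration of the bridge described above, the local max-ent shape functions can be sketched (in approximate notation, not taken verbatim from the paper) as the solution of a small convex program in which a parameter β trades shape-function width against entropy:

```latex
% Sketch of the local max-ent program (notation approximate, not verbatim from the paper).
% For a point x and nodes x_a, the shape-function values p_a(x) solve, for a parameter beta >= 0:
\begin{aligned}
\min_{p}\quad & \beta \sum_a p_a \,\lvert x - x_a \rvert^2 \;+\; \sum_a p_a \ln p_a\\
\text{s.t.}\quad & p_a \ge 0,\qquad \sum_a p_a = 1,\qquad \sum_a p_a\, x_a = x.
\end{aligned}
% beta -> 0 recovers global max-ent statistical inference; beta -> infinity recovers
% the piecewise-affine shape functions associated with a Delaunay triangulation.
```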

368 citations


Journal ArticleDOI
TL;DR: A novel video summarization technique using Delaunay clusters is proposed that generates good-quality summaries with fewer frames and less redundancy than other schemes.
Abstract: Recent advances in technology have made tremendous amounts of multimedia information available to the general population. An efficient way of dealing with this new development is to develop browsing tools that distill multimedia data into information-oriented summaries. Such an approach will not only suit resource-poor environments such as wireless and mobile, but will also enhance browsing on the wired side for applications like digital libraries and repositories. Automatic summarization and indexing techniques will give users an opportunity to browse and select multimedia documents of their choice for complete viewing later. In this paper, we present a technique by which we can automatically gather the frames of interest in a video for purposes of summarization. Our proposed technique is based on using Delaunay triangulation for clustering the frames in videos. We represent the frame contents as multi-dimensional point data and use Delaunay triangulation for clustering them. We propose a novel video summarization technique using Delaunay clusters that generates good-quality summaries with fewer frames and less redundancy when compared to other schemes. In contrast to many other clustering techniques, the Delaunay clustering algorithm is fully automatic, with no user-specified parameters, and is well suited for batch processing. We demonstrate these and other desirable properties of the proposed algorithm by testing it on a collection of videos from the Open Video Project. We provide a meaningful comparison between the results of the proposed summarization technique and those of the Open Video storyboard and k-means clustering. We evaluate the results in terms of metrics that measure the content representational value of the proposed technique.
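The abstract does not spell out how clusters are read off the triangulation; a common Delaunay-clustering heuristic, and purely an assumption here, is to drop edges that are much longer than the average edge and take the remaining connected components as clusters. A minimal sketch with SciPy (the helper name `delaunay_clusters` is hypothetical):

```python
# Hypothetical sketch of Delaunay-based clustering of video-frame features.
# Assumption: clusters are obtained by removing unusually long Delaunay edges;
# the paper's exact separation criterion may differ.
import numpy as np
from scipy.spatial import Delaunay

def delaunay_clusters(points, factor=2.0):
    """Cluster points by dropping Delaunay edges longer than factor * mean edge length."""
    tri = Delaunay(points)
    # Collect the unique undirected edges of the triangulation.
    edges = set()
    for simplex in tri.simplices:
        for i in range(len(simplex)):
            for j in range(i + 1, len(simplex)):
                edges.add(tuple(sorted((simplex[i], simplex[j]))))
    lengths = {e: np.linalg.norm(points[e[0]] - points[e[1]]) for e in edges}
    cutoff = factor * np.mean(list(lengths.values()))
    # Union-find over the short edges gives the connected components (clusters).
    parent = list(range(len(points)))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    for (a, b), d in lengths.items():
        if d <= cutoff:
            parent[find(a)] = find(b)
    return np.array([find(i) for i in range(len(points))])

# Example: frames reduced to 2-D feature points (e.g., by projecting colour histograms).
rng = np.random.default_rng(0)
frames_2d = np.vstack([rng.normal(0, 0.2, (50, 2)), rng.normal(4, 0.2, (60, 2))])
print(np.unique(delaunay_clusters(frames_2d)))  # ideally two cluster labels
```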

330 citations


Journal ArticleDOI
TL;DR: In this paper, the authors applied Delaunay triangulation to models of known zeolite frameworks and showed that this well-established technique from computational geometry provides, for each framework: (i) the location and shape of the open pores and channels, (ii) the diameter of the largest possible included sphere, and, indirectly, (iii) the largest free sphere that can diffuse through the framework by at least one lattice translation.

195 citations


Journal ArticleDOI
TL;DR: In this article, a 3D Delaunay triangulation is used to determine the lattice connections, and the effective cross-section areas of the connecting struts are defined by performing a three-dimensional domain tessellation similar to a Voronoi tessellation.

180 citations


Journal ArticleDOI
TL;DR: The proposed compression method combines an approximation scheme based on the Delaunay triangulation of a small set Y of significant pixels with a customized scattered-data coding scheme, and is compared with JPEG2000 on two geometric images and on three popular test cases of real images.

148 citations


Book ChapterDOI
01 Jan 2006
TL;DR: The surfaces considered in surface reconstruction are 2-manifolds, possibly with boundary, embedded in some Euclidean space R^3; the reconstruction is generally represented as a triangulated surface that can be directly used by downstream computer programs for further processing.
Abstract: The surfaces considered in surface reconstruction are 2-manifolds that might have boundaries and are embedded in some Euclidean space R^3. In the surface reconstruction problem we are given only a finite sample P ⊂ R^3 of an unknown surface S. The task is to compute a model of S from P. This model is referred to as the reconstruction of S from P. It is generally represented as a triangulated surface that can be directly used by downstream computer programs for further processing. The reconstruction should match the original surface in terms of geometric and topological properties. In general, surface reconstruction is an ill-posed problem since there are several triangulated surfaces that might fulfill these criteria. Note that this is in contrast to the curve reconstruction problem, where the optimal reconstruction is a polygon that connects the sample points in exactly the same way as they are connected along the original curve. The difficulty of meeting geometric or topological criteria depends on properties of the sample and on properties of the sampled surface. In particular, sparsity, redundancy and noisiness of the sample, or non-smoothness and boundaries of the surface, make surface reconstruction a challenging problem.

146 citations


Journal ArticleDOI
01 Jul 2006
TL;DR: By extending an incremental algorithm for Delaunay triangulation to use finalization tags and produce streaming mesh output, a billion-triangle terrain representation for the Neuse River system is computed from 11.2 GB of LIDAR data in 48 minutes using only 70 MB of memory on a laptop with two hard drives.
Abstract: We show how to greatly accelerate algorithms that compute Delaunay triangulations of huge, well-distributed point sets in 2D and 3D by exploiting the natural spatial coherence in a stream of points. We achieve large performance gains by introducing spatial finalization into point streams: we partition space into regions, and augment a stream of input points with finalization tags that indicate when a point is the last in its region. By extending an incremental algorithm for Delaunay triangulation to use finalization tags and produce streaming mesh output, we compute a billion-triangle terrain representation for the Neuse River system from 11.2 GB of LIDAR data in 48 minutes using only 70 MB of memory on a laptop with two hard drives. This is a factor of twelve faster than the previous fastest out-of-core Delaunay triangulation software.
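As a rough, hypothetical illustration of the spatial-finalization idea (not the authors' pipeline, which works out-of-core in 2D and 3D), one can imagine a pass that counts points per grid cell and then re-emits the stream with a finalize tag as soon as the last point of a cell has passed; a streaming triangulator can then certify and write out any triangle whose circumball lies entirely within finalized cells:

```python
# Hypothetical sketch of injecting spatial finalization tags into a point stream.
# Assumption: a simple uniform grid and a two-pass scheme; the paper's streaming
# pipeline is considerably more elaborate.
import numpy as np

def finalize_stream(points, cells_per_axis=4):
    pts = np.asarray(points, dtype=float)
    lo, hi = pts.min(axis=0), pts.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)

    def cell_of(p):
        idx = np.minimum(((p - lo) / span * cells_per_axis).astype(int),
                         cells_per_axis - 1)
        return tuple(idx)

    # Pass 1: count how many points fall in each cell.
    remaining = {}
    for p in pts:
        c = cell_of(p)
        remaining[c] = remaining.get(c, 0) + 1

    # Pass 2: re-emit points, tagging a cell as finalized after its last point.
    for p in pts:
        c = cell_of(p)
        yield ("point", tuple(p))
        remaining[c] -= 1
        if remaining[c] == 0:
            yield ("finalize", c)  # triangles whose circumballs lie only in
                                   # finalized cells can now be written out

rng = np.random.default_rng(1)
for event in finalize_stream(rng.random((10, 2)), cells_per_axis=2):
    print(event)
```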

138 citations


Proceedings Article
16 Jul 2006
TL;DR: This paper presents a method for abstracting an environment represented using constrained Delaunay triangulations in a way that significantly reduces pathfinding search effort while better representing the basic structure of the environment.
Abstract: In this paper we present a method for abstracting an environment represented using constrained Delaunay triangulations in a way that significantly reduces pathfinding search effort, as well as better representing the basic structure of the environment. The techniques shown here are ideal for objects of varying sizes and environments that are not axis-aligned or that contain many dead-ends, long corridors, or jagged walls that complicate other search techniques. In fact, the abstraction simplifies pathfinding to deciding to which side of each obstacle to go. This technique is suited to real-time computation both because of its speed and because it lends itself to an anytime algorithm, allowing it to work when varying amounts of resources are assigned to pathfinding. We test search algorithms running on both the base triangulation (Triangulation A* - TA*) and our abstraction (Triangulation Reduction A* - TRA*) against A* and PRA* on grid-based maps from the commercial games Baldur's Gate and WarCraft III. We find that in these cases almost all paths are found much faster using TA*, and more so using TRA*.

137 citations


Book
01 Jan 2006
TL;DR: The basic theory necessary to construct and manipulate triangulations is presented, and a tour through the theory behind the Delaunay triangulation, including algorithms and software issues, is given.
Abstract: This book will serve as a valuable source of information about triangulations for the graduate student and researcher. With emphasis on computational issues, it presents the basic theory necessary to construct and manipulate triangulations. In particular, the book gives a tour through the theory behind the Delaunay triangulation, including algorithms and software issues. It also discusses various data structures used for the representation of triangulations.

126 citations


Proceedings ArticleDOI
12 Jul 2006
TL;DR: A centralized sensor deployment method, DT-Score, is proposed that aims to maximize the coverage of a given sensing area with obstacles and can reach higher coverage than grid-based and random deployment methods as the number of deployable sensors increases.
Abstract: To obtain satisfactory performance from a wireless sensor network, an adaptable sensor deployment method for various applications is essential. In this paper, we propose a centralized sensor deployment method, DT-Score, which aims to maximize the coverage of a given sensing area with obstacles. DT-Score consists of two phases. In the first phase, we use a contour-based deployment to eliminate the coverage holes near the boundary of the sensing area and the obstacles. In the second phase, a deployment method based on the Delaunay triangulation is applied to the uncovered regions. Before deploying a sensor, each candidate position generated from the current sensor configuration is scored by a probabilistic sensor detection model. A new sensor is placed at the position with the greatest coverage gain. According to the simulation results, DT-Score reaches higher coverage than grid-based and random deployment methods as the number of deployable sensors increases.
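A hedged sketch of the second phase: here the candidate positions are taken to be the circumcenters of the Delaunay triangles of the current layout, scored with a simple binary disc coverage model; the paper's candidate generation and probabilistic detection model may differ in detail:

```python
# Hypothetical sketch of a Delaunay-guided greedy placement step (binary disc
# coverage model instead of the paper's probabilistic sensor model).
import numpy as np
from scipy.spatial import Delaunay

def circumcenter(a, b, c):
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return np.array([ux, uy])

def place_next_sensor(sensors, grid, radius):
    """Pick the Delaunay-triangle circumcenter that covers the most uncovered grid points."""
    sensors = np.asarray(sensors, dtype=float)
    covered = (np.linalg.norm(grid[:, None] - sensors[None], axis=2) <= radius).any(axis=1)
    candidates = [circumcenter(*sensors[s]) for s in Delaunay(sensors).simplices]
    def gain(c):
        return np.sum(~covered & (np.linalg.norm(grid - c, axis=1) <= radius))
    return max(candidates, key=gain)

# Example: unit square sampled on a grid, three sensors already deployed.
xs, ys = np.meshgrid(np.linspace(0, 1, 30), np.linspace(0, 1, 30))
grid = np.column_stack([xs.ravel(), ys.ravel()])
print(place_next_sensor([[0.2, 0.2], [0.8, 0.2], [0.5, 0.9]], grid, radius=0.25))
```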

86 citations


Journal ArticleDOI
TL;DR: A dual structure of the Voronoi diagram of three-dimensional spheres, called a quasi-triangulation, is defined and its important properties are presented, and a data structure based on arrays is proposed to compactly store the topology of the quasi-triangulation with guaranteed query performance.
Abstract: It is well-known that the Voronoi diagram of points and the power diagram for weighted points, such as spheres, are cell complexes, and their respective dual structures, i.e. the Delaunay triangulation and the regular triangulation, are simplicial complexes. Hence, the topologies of these diagrams are usually stored in their dual complexes using a very compact data structure of arrays. The topology of the Voronoi diagram of three-dimensional spheres in the Euclidean distance metric, on the other hand, is stored in a radial edge data structure which is not as compact as the data structure used for the Voronoi diagram of points and the power diagram for weighted points. In this paper, we define a dual structure of the Voronoi diagram of three-dimensional spheres called a quasi-triangulation and present its important properties. Based on the properties of a quasi-triangulation, we propose a data structure, called an interworld data structure, based on arrays to compactly store the topology of the quasi-triangulation with a guaranteed query performance.

Journal ArticleDOI
TL;DR: A robust and fast algorithm is presented for performing astrometry and source cross-identification on lists of two-dimensional points, such as between a catalog and an astronomical image, or between two images, tailored to work efficiently on wide fields with a large number of sources and significant nonlinear distortions.
Abstract: We present a robust and fast algorithm for performing astrometry and source cross-identification on lists of two-dimensional points, such as between a catalog and an astronomical image, or between two images. The method is based on minimal assumptions: the lists can be rotated, magnified, and inverted with respect to each other in an arbitrary way. The algorithm is tailored to work efficiently on wide fields with a large number of sources and significant nonlinear distortions, as long as the distortions can be approximated with linear transformations locally over the scale length of the average distance between the points. The procedure is based on symmetric point matching in a newly defined continuous triangle space that consists of triangles generated by extended Delaunay triangulation. Our software implementation performed at a 99.995% success rate on ∼260,000 frames taken by the HATNet project.
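An illustrative sketch (not the authors' implementation) of the underlying idea: triangulate each point list, map every triangle to a descriptor that is invariant under rotation, translation, and uniform scaling, and match triangles between the lists by nearest neighbours in that descriptor space. The side-length-ratio descriptor below is an assumption; the paper defines its own continuous triangle space over an extended Delaunay triangulation:

```python
# Hypothetical sketch of triangle-space matching between two 2-D point lists.
import numpy as np
from scipy.spatial import Delaunay, cKDTree

def triangle_descriptors(points):
    tri = Delaunay(points)
    descs, simps = [], []
    for s in tri.simplices:
        a, b, c = points[s]
        sides = np.sort([np.linalg.norm(a - b), np.linalg.norm(b - c),
                         np.linalg.norm(c - a)])
        # Two side-length ratios are invariant to rotation, translation, and scaling.
        descs.append([sides[0] / sides[2], sides[1] / sides[2]])
        simps.append(s)
    return np.array(descs), np.array(simps)

def match_triangles(points_a, points_b, tol=1e-3):
    da, sa = triangle_descriptors(points_a)
    db, sb = triangle_descriptors(points_b)
    dist, idx = cKDTree(db).query(da)
    return [(sa[i], sb[j]) for i, (d, j) in enumerate(zip(dist, idx)) if d < tol]

# Example: the second list is a rotated, scaled, shifted copy of the first.
rng = np.random.default_rng(2)
A = rng.random((40, 2))
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
B = 2.5 * A @ R.T + np.array([10.0, -3.0])
print(len(match_triangles(A, B)), "triangle pairs matched")
```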

Posted Content
TL;DR: In this paper, the existence of a convex polytope with a given metric on the boundary is shown to result from a certain deformation in the class of generalized convex polytopes with the given boundary.
Abstract: We present a constructive proof of Alexandrov's theorem regarding the existence of a convex polytope with a given metric on the boundary. The polytope is obtained as a result of a certain deformation in the class of generalized convex polytopes with the given boundary. We study the space of generalized convex polytopes and discover a relation with the weighted Delaunay triangulations of polyhedral surfaces. The existence of the deformation follows from the non-degeneracy of the Hessian of the total scalar curvature of a positively curved generalized convex polytope. The latter is shown to be equal to the Hessian of the volume of the dual generalized polyhedron. We prove the non-degeneracy by generalizing the Alexandrov-Fenchel inequality. Our construction of a convex polytope from a given metric is implemented in a computer program.

Book ChapterDOI
01 Jan 2006
TL;DR: A practical algorithm which extends the basic Delaunay refinement scheme is proposed and generates an isotropic mesh corresponding to a sizing function which can be either user-specified or automatically derived from the geometric data.
Abstract: This paper discusses the problem of refining constrained Delaunay tetrahedralizations (CDTs) into good quality meshes suitable for adaptive numerical simulations. A practical algorithm which extends the basic Delaunay refinement scheme is proposed. It generates an isotropic mesh corresponding to a sizing function which can be either user-specified or automatically derived from the geometric data. Analysis shows that the algorithm is able to produce provably good meshes, i.e., most output tetrahedra have their circumradius-to-shortest-edge ratios bounded, except those in the neighborhood of small input angles. Good mesh conformity can be obtained for smoothly changing sizing information. The algorithm has been implemented. Various examples are provided to illustrate the theoretical aspects and practical performance of the algorithm.

Proceedings ArticleDOI
30 Jul 2006
TL;DR: An incremental algorithm is given to construct an intrinsic Delaunay triangulation of a surface, together with an overlay structure which captures the relationship between the extrinsic and intrinsic triangulations.
Abstract: The discrete Laplace-Beltrami operator plays a prominent role in many Digital Geometry Processing applications ranging from denoising to parameterization, editing, and physical simulation. The standard discretization uses the cotangents of the angles in the immersed mesh which leads to a variety of numerical problems. We advocate use of the intrinsic Laplace-Beltrami operator. It satisfies a local maximum principle, guaranteeing, e.g., that no flipped triangles can occur in parameterizations. It also leads to better conditioned linear systems. The intrinsic Laplace-Beltrami operator is based on an intrinsic Delaunay triangulation of the surface. We give an incremental algorithm to construct such triangulations together with an overlay structure which captures the relationship between the extrinsic and intrinsic triangulations. Using a variety of example meshes we demonstrate the numerical benefits of the intrinsic Laplace-Beltrami operator.
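For reference, the cotangent discretization this work builds on, and the intrinsic Delaunay condition behind it, can be written as follows (notation mine):

```latex
% Standard cotangent discretization (notation mine), which the intrinsic operator reuses
% with angles measured in the intrinsic Delaunay triangulation of the surface:
(\Delta f)_i \;=\; \tfrac{1}{2} \sum_{j \in N(i)} \bigl(\cot\alpha_{ij} + \cot\beta_{ij}\bigr)\,(f_j - f_i)
% An interior edge ij is (intrinsically) Delaunay when the two angles opposite it satisfy
\alpha_{ij} + \beta_{ij} \;\le\; \pi,
% which is exactly the condition that makes the edge weight cot(alpha_ij) + cot(beta_ij)
% non-negative; non-Delaunay edges are removed by intrinsic edge flips.
```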

Proceedings ArticleDOI
05 Jun 2006
TL;DR: A new measurable quantity is introduced, called the Lipschitz radius, which plays a role similar to that of the local feature size in the smooth setting, but which is well-defined and positive on a much larger class of shapes.
Abstract: In the last decade, a great deal of work has been devoted to the elaboration of a sampling theory for smooth surfaces. The goal was to ensure a good reconstruction of a given surface S from a finite subset E of S. The sampling conditions proposed so far offer guarantees provided that E is sufficiently dense with respect to the local feature size of S, which can be true only if S is smooth since the local feature size vanishes at singular points. In this paper, we introduce a new measurable quantity, called the Lipschitz radius, which plays a role similar to that of the local feature size in the smooth setting, but which is well-defined and positive on a much larger class of shapes. Specifically, it characterizes the class of Lipschitz surfaces, which includes in particular all piecewise smooth surfaces such that the normal deviation is not too large around singular points. Our main result is that, if S is a Lipschitz surface and E is a sample of S such that any point of S is at distance less than a fraction of the Lipschitz radius of S, then we obtain similar guarantees as in the smooth setting. More precisely, we show that the Delaunay triangulation of E restricted to S is a 2-manifold isotopic to S lying at bounded Hausdorff distance from S, provided that its facets are not too skinny. We further extend this result to the case of loose samples. As an application, the Delaunay refinement algorithm we proved correct for smooth surfaces works as well and comes with similar guarantees when applied to Lipschitz surfaces.

Journal ArticleDOI
TL;DR: The proposed β-shape fully accounts for the size differences among spheres and is therefore more appropriate for the efficient and correct solution of problems in biological systems such as proteins.
Abstract: The Voronoi diagram of a point set has been extensively used in various disciplines ever since it was first proposed. Its application realms have been further extended to estimate the shape of point clouds when Edelsbrunner and Mücke introduced the concept of the α-shape based on the Delaunay triangulation of a point set. In this paper, we present the theory of the β-shape for a set of three-dimensional spheres as the generalization of the well-known α-shape for a set of points. The proposed β-shape fully accounts for the size differences among spheres and is therefore more appropriate for the efficient and correct solution of problems in biological systems such as proteins. Once the Voronoi diagram of spheres is given, the corresponding β-shape can be efficiently constructed and various geometric computations on the sphere complex can be efficiently and correctly performed. It turns out that many important problems in biological systems such as proteins can be easily solved via the Voronoi diagram of atoms in proteins and β-shapes transformed from the Voronoi diagram.

Book ChapterDOI
01 Jan 2006
TL;DR: A new algorithm, Sparse Voronoi Refinement, is presented that produces a conformal Delaunay mesh in arbitrary dimension with guaranteed mesh size and quality.
Abstract: We present a new algorithm, Sparse Voronoi Refinement, that produces a conformal Delaunay mesh in arbitrary dimension with guaranteed mesh size and quality. Our algorithm runs in output-sensitive time O(n log(L/s)+m), with constants depending only on dimension and on prescribed element shape quality bounds. For a large class of inputs, including integer coordinates, this matches the optimal time bound of Θ(n log n + m). Our new technique uses interleaving: we maintain a sparse mesh as we mix the recovery of input features with the addition of Steiner vertices for quality improvement. This technical report is the long version of an article [HMP06] presented at IMR 2006, and contains full proofs.

Journal ArticleDOI
TL;DR: The EQSM method for the generation of unstructured triangular surface grids is presented and examples, based upon typical aircraft geometries, are included to demonstrate how high quality grids can be efficiently generated on surfaces that exhibit a high degree of geometric complexity.

Journal ArticleDOI
TL;DR: A new triangular mesh adaptivity algorithm for elliptic PDEs that combines a posteriori error estimation with centroidal Voronoi-Delaunay tessellations of domains in two dimensions is proposed and tested, and is shown to have several very desirable features.
Abstract: A new triangular mesh adaptivity algorithm for elliptic PDEs that combines a posteriori error estimation with centroidal Voronoi-Delaunay tessellations of domains in two dimensions is proposed and tested. The ability of the first ingredient to detect local regions of large error and the ability of the second ingredient to generate superior unstructured grids result in a mesh adaptivity algorithm that has several very desirable features, including the following. Errors are very well equidistributed over the triangles; at all levels of refinement, the triangles remain very well shaped, even if the grid size at any particular refinement level, when viewed globally, varies by several orders of magnitude; and the convergence rates achieved are the best obtainable using piecewise linear finite elements. This methodology can be easily extended to higher-order finite element approximations or mixed finite element formulations although only the linear approximation is considered in this paper.
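The centroidal Voronoi ingredient can be illustrated with a discretized, density-weighted Lloyd iteration; the density below stands in for an a posteriori error indicator (an assumption for illustration only), and the Delaunay triangulation of the resulting generators would serve as the adapted mesh:

```python
# Hypothetical sketch of a density-weighted Lloyd iteration approximating a
# centroidal Voronoi tessellation on the unit square (Monte-Carlo centroids).
import numpy as np

def lloyd(generators, density, n_samples=20000, n_iters=50, rng=None):
    rng = rng or np.random.default_rng(0)
    gen = np.array(generators, dtype=float)
    for _ in range(n_iters):
        samples = rng.random((n_samples, 2))
        w = density(samples)                      # e.g., an error indicator
        nearest = np.argmin(np.linalg.norm(samples[:, None] - gen[None], axis=2), axis=1)
        for k in range(len(gen)):
            mask = nearest == k
            if mask.any():
                # Move each generator to the weighted centroid of its Voronoi cell.
                gen[k] = np.average(samples[mask], axis=0, weights=w[mask])
    return gen

# Example: concentrate generators near the origin corner, mimicking refinement there.
density = lambda x: 1.0 + 20.0 * np.exp(-10.0 * np.sum(x**2, axis=1))
pts = lloyd(np.random.default_rng(1).random((64, 2)), density)
print(pts[:5])
```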

Proceedings ArticleDOI
02 Jul 2006
TL;DR: An accurate and efficient fingerprint indexing algorithm is presented that retrieves the top N possibly matched candidates from a huge database; to cope with distortion, it uses novel distortion-insensitive features, formed by the Delaunay triangulation of the minutiae set, as the representation unit.
Abstract: This paper concentrates on an accurate and efficient fingerprint indexing algorithm, which efficiently retrieves the top N possibly matched candidates from a huge database. In order to cope with distorted fingerprints, the proposed algorithm uses novel features, insensitive to distortion, formed by the Delaunay triangulation of the minutiae set as the representation unit. These features include minutia detail and the Delaunay triangle (its handedness, angles, maximum edge, and the related angle between the orientation field and the edges). Experiments on the FVC 2000 database and on scanned fingerprints with heavy distortion show that our algorithm considerably narrows down the search space in fingerprint databases and is also applicable to distorted fingerprints. We also compared it with other indexing approaches, and the results show that our algorithm has better performance, especially on fingerprints with heavy distortion. This algorithm has another significant advantage: it provides control points for fingerprint distortion compensation.
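A hedged sketch of the purely geometric part of the feature extraction: for each Delaunay triangle of the minutiae positions, record its handedness (sign of the signed area), its interior angles, and its longest edge. The paper's feature vector also includes minutia detail and angles relative to the orientation field, which are omitted here:

```python
# Hypothetical sketch of Delaunay-triangle features over a fingerprint minutiae set.
import numpy as np
from scipy.spatial import Delaunay

def triangle_features(minutiae_xy):
    pts = np.asarray(minutiae_xy, dtype=float)
    feats = []
    for s in Delaunay(pts).simplices:
        a, b, c = pts[s]
        # Sign of the signed area encodes the triangle's handedness (orientation).
        signed_area = 0.5 * ((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]))
        sides = np.array([np.linalg.norm(b - c), np.linalg.norm(c - a),
                          np.linalg.norm(a - b)])          # sides opposite a, b, c
        # Interior angles from the law of cosines.
        angles = []
        for i in range(3):
            s1, s2, s3 = sides[i], sides[(i + 1) % 3], sides[(i + 2) % 3]
            angles.append(np.degrees(np.arccos((s2**2 + s3**2 - s1**2) / (2 * s2 * s3))))
        feats.append({"handedness": np.sign(signed_area),
                      "angles": sorted(angles),
                      "max_edge": sides.max()})
    return feats

rng = np.random.default_rng(3)
print(triangle_features(rng.random((12, 2)) * 300)[0])
```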

Journal ArticleDOI
TL;DR: The polynomial preserving recovery (PPR) is used to enhance the finite element eigenvalue approximation and remarkable fourth order convergence is observed for linear elements under structured meshes as well as unstructured initial meshes.
Abstract: The polynomial preserving recovery (PPR) is used to enhance the finite element eigenvalue approximation. Remarkable fourth order convergence is observed for linear elements under structured meshes as well as unstructured initial meshes (produced by the Delaunay triangulation) with the conventional bisection refinement.

Dissertation
01 Jan 2006
TL;DR: It is argued that constructing the VD/DT of the samples that were collected to study the field can be beneficial for extracting meaningful information from it, and the usefulness of this Voronoi-based spatial model is demonstrated with a series of potential applications in geoscience.
Abstract: The objects studied in geoscience are often not man-made objects, but rather the spatial distribution of three-dimensional continuous geographical phenomena such as the salinity of a body of water, the humidity of the air or the percentage of gold in the rock (phenomena that tend to change over time). These are referred to as fields, and their modelling with geographical information systems is problematic because the structures of these systems are usually two-dimensional and static. Raster structures (voxels or octrees) are the most popular solutions, but, as I argue in this thesis, they have several shortcomings for geoscientific fields. As an alternative to using rasters for representing and modelling three-dimensional fields, I propose using a new spatial model based on the Voronoi diagram (VD) and its dual, the Delaunay tetrahedralization (DT). I argue that constructing the VD/DT of the samples that were collected to study the field can be beneficial for extracting meaningful information from it. Firstly, the tessellation of space obtained with the VD gives a clear and consistent definition of neighbourhood for unconnected points in three dimensions, which is useful since geoscientific datasets often have highly anisotropic distributions. Secondly, an efficient and robust reconstruction of the field can be obtained with natural neighbour interpolation, which is entirely based on the properties of the VD. Thirdly, the tessellations of the VD and the DT make possible, and even optimise, several spatial analysis and visualisation operations. A further important consideration is that the VD/DT is locally modifiable (insertion, deletion and movement of points), which permits us to model the temporal dimension, and also to interactively explore a dataset, thus gaining insight by observing on the fly the consequences of manipulations and spatial analysis operations. In this thesis, the development of this new spatial model is approached from an algorithmic point of view, i.e. I describe in detail algorithms to construct, manipulate, analyse and visualise fields represented with the VD/DT. A strong emphasis is put on the implementation of the spatial model, and, for this reason, the many degeneracies that arise in three-dimensional geometric computing are described and handled. A new data structure, the augmented quad-edge, is also presented. It permits us to store simultaneously both the VD and the DT, and helps in the analysis of fields. Finally, the usefulness of this Voronoi-based spatial model is demonstrated with a series of potential applications in geoscience.

Journal ArticleDOI
TL;DR: In this paper, the authors examined differences in catchment geomorphology and hydrology as a result of kriging and Delaunay triangulation gridding methods and concluded that either method is appropriate for the study catchment.
Abstract: Many digital elevation models are gridded from irregularly spaced data. This study examines differences in catchment geomorphology and hydrology resulting from kriging and Delaunay triangulation gridding methods. Both methods have been widely used as tools for placing irregularly spaced data onto a regular grid. In the past, numerical modelling has been performed with little assessment of model input variability, and the variability produced by different gridding methods has not been fully assessed. Given the potential impact of digital elevation model error on model output, little work has been done to understand this type of error. Examination of the different catchment realizations demonstrates that when using kriging (point or block) or Delaunay triangulation there are subtle differences in catchment area and elevation as well as networking properties. Subtle differences also exist in hillslope profile. Nevertheless, when comparing catchment descriptors such as the hypsometric curve, area-slope relationship and cumulative area distribution, there is little hydrological or geomorphological difference between these catchments. Further, when these digital elevation models are used as landscape input in a long-term landscape evolution model (i.e. SIBERIA), there is little geomorphological or hydrological difference between the two digital elevation models after a 50 000 year modelled period. Consequently, either method is appropriate for the study catchment. These findings provide confidence in the conversion of irregularly spaced data onto a regular grid using either kriging or Delaunay triangulation.

DOI
01 Jan 2006
TL;DR: The concept of virtual particles is introduced to implement efficient refinement and coarsification operators, and to achieve a consistent coupling between particles at different resolution levels, leading to speedups of up to a factor of six as compared to single resolution simulations.
Abstract: We present a new multiresolution particle method for fluid simulation. The discretization of the fluid dynamically adapts to the characteristics of the flow to resolve fine-scale visual detail, while reducing the overall complexity of the computations. We introduce the concept of virtual particles to implement efficient refinement and coarsification operators, and to achieve a consistent coupling between particles at different resolution levels, leading to speedups of up to a factor of six as compared to single resolution simulations. Our system supports multiphase effects such as bubbles and foam, as well as rigid body interactions, based on a unified particle interaction metaphor. The water-air interface is tracked with a Lagrangian level set approach using a novel Delaunay-based surface contouring method that accurately resolves fine-scale surface detail while guaranteeing preservation of fluid volume.

Proceedings ArticleDOI
05 Jun 2006
TL;DR: An implementation of a compact parallel algorithm for 3D Delaunay tetrahedralization on a 64-processor shared-memory machine is described; it uses a concurrent version of Bowyer-Watson incremental insertion and a thread-safe, space-efficient structure for representing the mesh.
Abstract: We describe an implementation of a compact parallel algorithm for 3D Delaunay tetrahedralization on a 64-processor shared-memory machine. Our algorithm uses a concurrent version of the Bowyer-Watson incremental insertion, and a thread-safe, space-efficient structure for representing the mesh. Using the implementation we are able to generate significantly larger Delaunay meshes than have previously been generated: 10 billion tetrahedra on a 64-processor SMP using 200 GB of RAM. The implementation makes use of a locality-based relabeling of the vertices that serves three purposes: it is used as part of the space-efficient representation, it improves memory locality, and it reduces the overhead necessary for locks. The implementation also makes use of a caching technique to avoid excessive decoding of vertex information, a technique for backing out of insertions that collide, and a shared work queue for maintaining points that have yet to be inserted.
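For orientation, the serial 2D Bowyer-Watson insertion step that the paper parallelizes (in 3D, with locks, vertex relabeling, and a compact mesh representation) can be sketched as follows; this toy version assumes points in general position and uses a finite super-triangle rather than the exact predicates a robust implementation would need:

```python
# Hypothetical sketch of serial 2-D Bowyer-Watson incremental insertion.
import numpy as np

def circumcircle(a, b, c):
    """Circumcenter and squared circumradius of triangle (a, b, c)."""
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), (ax - ux) ** 2 + (ay - uy) ** 2

def bowyer_watson(points):
    pts = {i: (float(p[0]), float(p[1])) for i, p in enumerate(points)}
    m = max(max(abs(x), abs(y)) for x, y in pts.values()) + 1.0
    # Vertices -3, -2, -1 form a super-triangle enclosing all input points.
    pts.update({-3: (-4 * m, -4 * m), -2: (4 * m, -4 * m), -1: (0.0, 8 * m)})
    tris = {(-3, -2, -1)}
    for i in range(len(points)):
        px, py = pts[i]
        # Cavity: triangles whose circumcircle contains the new point.
        bad = []
        for t in tris:
            (cx, cy), r2 = circumcircle(pts[t[0]], pts[t[1]], pts[t[2]])
            if (px - cx) ** 2 + (py - cy) ** 2 < r2:
                bad.append(t)
        # Cavity boundary: edges that belong to exactly one cavity triangle.
        edge_count = {}
        for t in bad:
            for e in ((t[0], t[1]), (t[1], t[2]), (t[2], t[0])):
                e = tuple(sorted(e))
                edge_count[e] = edge_count.get(e, 0) + 1
        tris.difference_update(bad)
        for (u, v), count in edge_count.items():
            if count == 1:
                tris.add((u, v, i))   # star the cavity boundary from the new point
    # Discard triangles that still use a super-triangle vertex.
    return [t for t in tris if min(t) >= 0]

rng = np.random.default_rng(5)
print(len(bowyer_watson(rng.random((30, 2)))), "Delaunay triangles")
```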

Journal ArticleDOI
TL;DR: Locally optimal Delaunay triangulations are constructed to improve previous image approximation schemes; the construction relies on a local optimization procedure, termed exchange, whose efficient implementation and complexity are discussed.
Abstract: Locally optimal Delaunay triangulations are constructed to improve previous image approximation schemes. Our construction relies on a local optimization procedure, termed exchange. The efficient implementation of the exchange algorithm is addressed, and its complexity is discussed. The good performance of our improved image approximation is illustrated by numerical comparisons.

Journal ArticleDOI
TL;DR: The public-domain Matlab two-dimensional mesh generation package BatTri is easily upgradeable to meet future demands through the addition of new grid generation algorithms and Delaunay refinement schemes as they become available.

Journal Article
TL;DR: In this paper, the authors consider the problem of approximating normals and feature sizes of a surface from point cloud data that may be noisy, and provide new algorithms for practical and reliable normal and feature approximations.
Abstract: We consider the problem of approximating normals and feature sizes of a surface from point cloud data that may be noisy. These problems are central to many applications dealing with point cloud data. In the noise-free case, the normals and feature sizes can be approximated by the centers of a set of unique large Delaunay balls called polar balls. In the presence of noise, polar balls do not necessarily remain large and hence their centers may not be good for normal and feature size approximations. Earlier works suggest that some large Delaunay balls can play the role of polar balls. However, these results fell short of explaining how the big Delaunay balls should be chosen for reliable approximations and how the approximation error depends on various factors. We provide new analyses that fill these gaps. In particular, they lead to new algorithms for practical and reliable normal and feature approximations.
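In the noise-free setting, the pole-based estimate can be sketched as follows (an illustration under assumptions, not the paper's noise-robust algorithm): the pole of a sample is the farthest vertex of its Voronoi cell, and the vector from the sample to its pole approximates the surface normal direction up to sign:

```python
# Hypothetical sketch of pole-based normal estimation from a noise-free sample
# of a surface (a unit sphere here); the paper analyses the noisy case, where
# centers of suitably big Delaunay balls replace the poles.
import numpy as np
from scipy.spatial import Voronoi

def pole_normals(points):
    vor = Voronoi(points)
    normals = []
    for i, region_idx in enumerate(vor.point_region):
        region = [v for v in vor.regions[region_idx] if v != -1]  # drop vertex at infinity
        if not region:
            normals.append(np.zeros(3))
            continue
        verts = vor.vertices[region]
        # The pole is the Voronoi vertex of this cell farthest from the sample.
        pole = verts[np.argmax(np.linalg.norm(verts - points[i], axis=1))]
        n = pole - points[i]
        normals.append(n / np.linalg.norm(n))
    return np.array(normals)

# Example: points on the unit sphere, where the true normal equals the point itself.
rng = np.random.default_rng(4)
pts = rng.normal(size=(500, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
est = pole_normals(pts)
alignment = np.abs(np.sum(est * pts, axis=1))
print(f"median |cos| between estimated and true normals: {np.median(alignment):.3f}")
```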

Book ChapterDOI
18 Sep 2006
TL;DR: This paper proposes a novel controllable edge clustering method based on Delaunay triangulation to reduce visual clutter in node-link diagrams; the method uses curves instead of straight lines to represent links, and these curves can be grouped together according to their relative positions and directions.
Abstract: Node-link diagrams are widely used in information visualization to show relationships among data. However, when the size of data becomes very large, node-link diagrams will become cluttered and visually confusing for users. In this paper, we propose a novel controllable edge clustering method based on Delaunay triangulation to reduce visual clutter for node-link diagrams. Our method uses curves instead of straight lines to represent links and these curves can be grouped together according to their relative positions and directions. We further introduce progressive edge clustering to achieve continuous level-of-details for large networks.