
Showing papers on "Delaunay triangulation published in 2012"


Book
04 Dec 2012
TL;DR: The authors present algorithms for generating high-quality meshes in polygonal and polyhedral domains and illustrate how to use restricted Delaunay triangulations to extend the algorithms to surfaces with ridges and patches and volumes with smooth surfaces.
Abstract: Written by authors at the forefront of modern algorithms research, Delaunay Mesh Generation demonstrates the power and versatility of Delaunay meshers in tackling complex geometric domains ranging from polyhedra with internal boundaries to piecewise smooth surfaces. Covering both volume and surface meshes, the authors fully explain how and why these meshing algorithms work. The book is one of the first to integrate a vast amount of cutting-edge material on Delaunay triangulations. It begins by introducing the problem of mesh generation and describing algorithms for constructing Delaunay triangulations. The authors then present algorithms for generating high-quality meshes in polygonal and polyhedral domains. They also illustrate how to use restricted Delaunay triangulations to extend the algorithms to surfaces with ridges and patches and volumes with smooth surfaces. For researchers and graduate students, the book offers a rigorous theoretical analysis of mesh generation methods. It provides the necessary mathematical foundations and core theoretical results upon which researchers can build even better algorithms in the future. For engineers, the book shows how the algorithms work well in practice. It explains how to effectively implement them in the design and programming of mesh generation software.
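
As a concrete point of reference for the construction step the book begins with, here is a minimal sketch (ours, not one of the book's algorithms) that builds a planar Delaunay triangulation with SciPy and spot-checks the empty-circumcircle property that defines it:

```python
# Minimal sketch (not the book's algorithms): build a planar Delaunay
# triangulation with SciPy, then spot-check the defining empty-circumcircle
# property on the first triangle.
import numpy as np
from scipy.spatial import Delaunay

def in_circumcircle(a, b, c, d):
    """True if d lies strictly inside the circumcircle of triangle abc."""
    m = np.array([
        [a[0] - d[0], a[1] - d[1], (a[0] - d[0])**2 + (a[1] - d[1])**2],
        [b[0] - d[0], b[1] - d[1], (b[0] - d[0])**2 + (b[1] - d[1])**2],
        [c[0] - d[0], c[1] - d[1], (c[0] - d[0])**2 + (c[1] - d[1])**2],
    ])
    # Orientation sign makes the test independent of vertex order.
    orient = np.sign((b[0] - a[0]) * (c[1] - a[1])
                     - (b[1] - a[1]) * (c[0] - a[0]))
    return orient * np.linalg.det(m) > 0

rng = np.random.default_rng(0)
points = rng.random((30, 2))          # 30 random points in the unit square
tri = Delaunay(points)                # Qhull-based construction

a, b, c = points[tri.simplices[0]]
others = np.delete(np.arange(len(points)), tri.simplices[0])
assert not any(in_circumcircle(a, b, c, points[i]) for i in others)
```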

284 citations


Journal ArticleDOI
TL;DR: An automatic crack propagation modelling technique using polygon elements is presented, in which a simple algorithm generates a polygon mesh from a Delaunay triangulated mesh. The polygon element formulation is constructed from the scaled boundary finite element method (SBFEM), treating each polygon as an SBFEM subdomain, and is very efficient in modelling singular stress fields in the vicinity of cracks.
Abstract: An automatic crack propagation modelling technique using polygon elements is presented. A simple algorithm to generate a polygon mesh from a Delaunay triangulated mesh is implemented. The polygon element formulation is constructed from the scaled boundary finite element method (SBFEM), treating each polygon as an SBFEM subdomain, and is very efficient in modelling singular stress fields in the vicinity of cracks. Stress intensity factors are computed directly from their definitions without any nodal enrichment functions. An automatic remeshing algorithm capable of handling any n-sided polygon is developed to accommodate crack propagation. The algorithm is simple yet flexible because remeshing involves minimal changes to the global mesh and is limited to only the polygons on the crack paths. The efficiency of the polygon SBFEM in computing accurate stress intensity factors is first demonstrated for a problem with a stationary crack. Four crack propagation benchmarks are then modelled to validate the developed technique and demonstrate its salient features. The predicted crack paths show good agreement with experimental observations and numerical simulations reported in the literature. Copyright © 2012 John Wiley & Sons, Ltd.

191 citations


Journal ArticleDOI
TL;DR: In this paper, the authors propose a method for upscaling incompressible viscous flow in large random polydispersed sphere packings, where the emphasis is on the determination of the forces applied on the solid particles by the fluid.
Abstract: We propose a method for effectively upscaling incompressible viscous flow in large random polydispersed sphere packings: the emphasis of this method is on the determination of the forces applied on the solid particles by the fluid. Pore bodies and their connections are defined locally through a regular Delaunay triangulation of the packings. Viscous flow equations are upscaled at the pore level, and approximated with a finite volume numerical scheme. We compare numerical simulations of the proposed method to detailed finite element simulations of the Stokes equations for assemblies of 8–200 spheres. A good agreement is found both in terms of forces exerted on the solid particles and effective permeability coefficients.
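
A brief sketch of the bookkeeping this pore-network definition implies; note the assumption that SciPy's ordinary Delaunay triangulation stands in for the paper's regular (weighted) triangulation:

```python
# Sketch only: ordinary Delaunay stands in for the paper's regular
# (weighted) triangulation. Tetrahedra = pore bodies; shared facets
# between face-adjacent tetrahedra = pore throats.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)
centers = rng.random((200, 3))            # sphere centres of the packing

dt = Delaunay(centers)                    # in 3D the simplices are tetrahedra
throats = set()
for t, nbrs in enumerate(dt.neighbors):   # -1 marks a boundary facet
    for n in nbrs:
        if n != -1:
            throats.add((min(t, int(n)), max(t, int(n))))

print(len(dt.simplices), "pore bodies,", len(throats), "throats")
```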

100 citations


Journal ArticleDOI
TL;DR: In this paper, a new formula for the evaluation of the modal radiation Q factor is derived, which is based on the electric field integral equation, Delaunay triangulation, method of moments, Rao-Wilton-Glisson basis function and the theory of characteristic modes.
Abstract: A new formula for the evaluation of the modal radiation Q factor is derived. The total Q of selected structures is calculated from the set of eigenmodes with associated eigen-energies and eigen-powers. Thanks to the analytical expression of these quantities, the procedure is highly accurate, respecting arbitrary current densities flowing along the radiating device. The electric field integral equation, Delaunay triangulation, method of moments, Rao-Wilton-Glisson basis functions and the theory of characteristic modes constitute the underlying theoretical background. In terms of the modal radiation Q, all necessary relations are presented and the essential points of implementation are discussed. Calculation of the modal energies and Q factors enables us to study the effect of the radiating shape separately from the feeding. This approach can be very helpful in antenna design. A few examples are given, including a thin-strip dipole, two coupled dipoles, a bowtie antenna and an electrically small meander folded dipole. Results are compared with prior estimates and some observations are discussed. Good agreement is observed across the different methods.

86 citations


Journal ArticleDOI
TL;DR: A new density-based spatial clustering algorithm (DBSC) is developed by considering both spatial proximity and attribute similarity, and it is found that objects in the same cluster detected by the DBSC algorithm are proximal in a spatial domain and similar in an attribute domain.

83 citations


Journal ArticleDOI
TL;DR: This paper introduces a simplex stochastic collocation (SSC) method, a multielement UQ method based on simplex elements that can efficiently discretize nonhypercube probability spaces; it achieves superlinear convergence and a linear increase of the initial number of samples with increasing dimensionality.
Abstract: Stochastic collocation (SC) methods for uncertainty quantification (UQ) in computational problems are usually limited to hypercube probability spaces due to the structured grid of their quadrature rules. Nonhypercube probability spaces with an irregular shape of the parameter domain do, however, occur in practical engineering problems. For example, production tolerances and other geometrical uncertainties can lead to correlated random inputs on nonhypercube domains. In this paper, a simplex stochastic collocation (SSC) method, a multielement UQ method based on simplex elements, is introduced to efficiently discretize nonhypercube probability spaces. It combines the Delaunay triangulation of randomized sampling at adaptive element refinements with polynomial extrapolation to the boundaries of the probability domain. The robustness of the extrapolation is quantified by the definition of the essentially extremum diminishing (EED) robustness principle. Numerical examples show that the resulting SSC-EED method achieves superlinear convergence and a linear increase of the initial number of samples with increasing dimensionality. These properties are demonstrated for uniform and nonuniform distributions, and for correlated and uncorrelated parameters, in problems with 15 dimensions and discontinuous responses.
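
A heavily simplified sketch of such a Delaunay-based adaptive refinement loop; the error indicator below (element area times the range of vertex responses) is a toy stand-in, not the paper's SSC-EED criterion, and the domain is a 2D unit square rather than a general nonhypercube space:

```python
# Toy sketch of adaptive simplex-element refinement (not the SSC-EED
# criterion): refine the element where area x response-range is largest
# by adding a random point inside it, then re-triangulate.
import numpy as np
from scipy.spatial import Delaunay

def response(x):                              # stand-in discontinuous model
    return (x[:, 0] + x[:, 1] > 1.0).astype(float)

rng = np.random.default_rng(2)
pts = rng.random((20, 2))                     # initial random samples

for _ in range(30):
    tri = Delaunay(pts)
    f = response(pts)
    verts = pts[tri.simplices]                # shape (m, 3, 2)
    e1 = verts[:, 1] - verts[:, 0]
    e2 = verts[:, 2] - verts[:, 0]
    area = 0.5 * np.abs(e1[:, 0] * e2[:, 1] - e1[:, 1] * e2[:, 0])
    spread = np.ptp(f[tri.simplices], axis=1) # response range per element
    worst = np.argmax(area * spread)
    w = rng.dirichlet(np.ones(3))             # random barycentric weights
    pts = np.vstack([pts, w @ verts[worst]])

print(len(pts), "samples after refinement")
```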

72 citations


Proceedings ArticleDOI
16 May 2012
TL;DR: A modified virtual force-based node self-deployment algorithm for mobile nodes is proposed, which achieves a higher coverage rate and faster convergence than the traditional virtual force algorithm.
Abstract: The effectiveness of wireless sensor networks (WSN) depends on the coverage and connectivity provided by node deployment, which is one of the key topics in WSN. In this paper, a modified virtual force-based node self-deployment algorithm for nodes with mobility is proposed. In the virtual force-based approach, all nodes are seen as points subject to repulsive and attractive forces exerted among them, and nodes move according to the calculated resultant force. In the proposed approach, a Delaunay triangulation is formed over the nodes, and two nodes are defined as adjacent if they are connected in the Delaunay diagram. Force can only be exerted by adjacent nodes within the communication range. Simulation results show that the proposed approach achieves a higher coverage rate and faster convergence than the traditional virtual force algorithm.
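
A sketch of one step of this modified scheme; the force law and constants below are illustrative placeholders, not the paper's parameters:

```python
# Sketch of one virtual-force step where forces act only along Delaunay
# edges shorter than the communication range (constants are placeholders).
import numpy as np
from scipy.spatial import Delaunay

def delaunay_edges(pos):
    """Set of index pairs sharing a Delaunay edge."""
    tri = Delaunay(pos)
    edges = set()
    for s in tri.simplices:
        for i in range(3):
            a, b = sorted((s[i], s[(i + 1) % 3]))
            edges.add((a, b))
    return edges

def step(pos, r_comm=0.3, d_ref=0.15, k=0.01):
    force = np.zeros_like(pos)
    for a, b in delaunay_edges(pos):
        d = pos[b] - pos[a]
        dist = np.linalg.norm(d)
        if dist < r_comm:                    # only in-range neighbours act
            mag = k * (dist - d_ref) / dist  # attract if far, repel if near
            force[a] += mag * d
            force[b] -= mag * d
    return np.clip(pos + force, 0.0, 1.0)    # stay inside the unit arena

rng = np.random.default_rng(3)
nodes = rng.random((40, 2))
for _ in range(100):                         # iterate until roughly settled
    nodes = step(nodes)
```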

70 citations


Journal ArticleDOI
TL;DR: In this article, the existence of stationary Gibbsian point processes for interactions that act on hyperedges between the points is established, and the basic tools are an entropy bound and stationarity.
Abstract: We establish the existence of stationary Gibbsian point processes for interactions that act on hyperedges between the points. For example, such interactions can depend on Delaunay edges or triangles, cliques of Voronoi cells or clusters of k-nearest neighbors. The classical case of pair interactions is also included. The basic tools are an entropy bound and stationarity.

64 citations


Journal ArticleDOI
TL;DR: In this article, a singular seven-node crack-tip triangular element with edge-based strain smoothing is formulated for modeling crack growth in solids, and the stiffness matrix is obtained using the assumed displacement values (not the derivatives) over smoothing domains associated with the edges of elements.

61 citations


Patent
12 Dec 2012
TL;DR: In this paper, a system for modeling a fractured medium is presented, which discretizes fractures in a representation of the fractured medium, with the discretizing including defining points along the fractures and edges extending between adjacent points.
Abstract: Systems and methods for modeling a fractured medium are provided. The method includes discretizing fractures in a representation of the fractured medium, with the discretizing including defining points along the fractures and edges extending between adjacent points. The method also includes determining that at least one of the edges is a non-Gabriel edge, and removing the non-Gabriel edge from the representation. The method further includes approximating the removed non-Gabriel edge to generate an approximated edge, and inserting the approximated edge into the representation.
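
For reference, the Gabriel condition underlying these claims is easy to test; the sketch below assumes 2D points and a brute-force scan, which the patent itself does not prescribe:

```python
# Sketch of the Gabriel-edge test: edge (u, v) is a Gabriel edge iff no
# other point lies strictly inside the circle with uv as its diameter.
import numpy as np

def is_gabriel(points, u, v):
    center = (points[u] + points[v]) / 2.0
    r2 = np.sum((points[u] - center) ** 2)        # squared radius
    for i, p in enumerate(points):
        if i != u and i != v and np.sum((p - center) ** 2) < r2:
            return False
    return True

pts = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 0.1]])
print(is_gabriel(pts, 0, 1))   # False: point 2 sits inside the circle
```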

58 citations


Journal ArticleDOI
TL;DR: This paper presents a novel method for addressing the problem of finding more good feature pairs between images, which is one of the most fundamental tasks in computer vision and pattern recognition and significantly improves both the number of correct matches and the matching score.

Journal ArticleDOI
TL;DR: A new algorithm is introduced to directly reconstruct geometric models of building facades from terrestrial laser scanning data without using either manual intervention or a third-party, computer-aided design (CAD) package.
Abstract: A new algorithm is introduced to directly reconstruct geometric models of building facades from terrestrial laser scanning data without using either manual intervention or a third-party, computer-aided design (CAD) package. The algorithm detects building boundaries and features and converts the point cloud data into a solid model appropriate for computational modeling. The algorithm combines a voxel-based technique with a Delaunay triangulation–based criterion. In the first phase, the algorithm detects boundary points of the facade and its features from the raw data. Subsequently, the algorithm determines whether holes are actual openings or data deficits caused by occlusions and then fills unrealistic openings. The algorithm’s second phase creates a solid model using voxels in an octree representation. The algorithm was applied to the facades of three masonry buildings, successfully detected all openings, and correctly reconstructed the facade boundaries. Geometric validation of the models agains...

Journal ArticleDOI
TL;DR: Five distinct algorithms for morphological filters are compared in five aspects: algorithm verification, algorithm analysis, performance evaluation, end effects correction, and areal extension.
Abstract: Morphological filters, regarded as the complement of mean-line based filters, are useful in the analysis of surface texture and the prediction of functional performance. The paper first recalls two existing algorithms, the naive algorithm and the motif combination algorithm, originally developed for the traditional envelope filter. With minor extension, they can be used to compute morphological filters. A recent novel approach based on the relationship between the alpha shape and morphological closing and opening operations is presented as well. Afterwards two novel algorithms are developed. By correlating the convex hull and morphological operations, the Graham scan algorithm, originally developed for the convex hull, is modified to compute the morphological envelopes. The alpha shape method, which depends on the Delaunay triangulation, is costly and redundant when computing the alpha shape for a given radius; a recursive algorithm is proposed to solve this problem. A series of observations are presented for searching the contact points. Based on the proposed observations, the algorithm partitions the profile data into small segments and searches the contact points in a recursive manner. The paper proceeds to compare the five distinct algorithms in five aspects: algorithm verification, algorithm analysis, performance evaluation, end effects correction, and areal extension. By looking into these aspects, the merits and shortcomings of these algorithms are evaluated and compared.
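
As a concrete reference for the naive algorithm recalled above, here is a sketch of closing a uniformly sampled profile with a disk structuring element; the edge padding is a crude stand-in for the proper end-effects correction the paper discusses:

```python
# Sketch of the naive O(n*k) closing envelope: dilation by a disk,
# then erosion by the same disk (uniform sampling assumed).
import numpy as np

def close_profile(z, radius, dx):
    """Morphological closing of profile z with a disk of given radius."""
    k = int(radius / dx)
    t = np.arange(-k, k + 1) * dx
    ball = np.sqrt(radius**2 - t**2)        # disk cross-section heights
    zp = np.pad(z, k, mode='edge')          # crude end-effect handling
    dil = np.array([np.max(zp[i:i + 2*k + 1] + ball)
                    for i in range(len(z))])
    dp = np.pad(dil, k, mode='edge')
    return np.array([np.min(dp[i:i + 2*k + 1] - ball)
                     for i in range(len(z))])

x = np.linspace(0.0, 1.0, 500)
profile = 0.02 * np.sin(40 * x) + 0.005 * np.random.default_rng(7).random(500)
envelope = close_profile(profile, radius=0.05, dx=x[1] - x[0])
```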

Proceedings ArticleDOI
25 Jun 2012
TL;DR: A new registration-free Delaunay triangle-based fuzzy extractor is proposed that not only mitigates biometric uncertainty but also eliminates the feature pre-alignment process in fingerprint authentication.
Abstract: Bio-cryptography is a new security technology which combines cryptography with biometrics. Fuzzy extractors are effective in binding a cryptographic key to biometric features. However, most existing fuzzy extractors require fingerprint registration prior to their application and depend on error-correction codes to rectify the biometric uncertainty, which is impractical due to low matching performance. In this paper, by taking full advantage of the properties of a Delaunay triangulation net, e.g. its local structural stability, we propose a new registration-free Delaunay triangle-based fuzzy extractor. The new fuzzy extractor not only mitigates biometric uncertainty but also eliminates the feature pre-alignment process in fingerprint authentication. Experimental results show that the proposed scheme achieves better performance than existing registration-based fuzzy extractor methods.
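
A sketch of the kind of alignment-free local feature a Delaunay triangle yields; the encoding below (sorted side lengths) is our simple rotation- and translation-invariant choice, not the paper's richer feature vector:

```python
# Sketch: Delaunay triangles over fingerprint minutiae give local
# structures; sorted side lengths are invariant to rotation and shift.
import numpy as np
from scipy.spatial import Delaunay

def triangle_features(minutiae_xy):
    tri = Delaunay(minutiae_xy)
    feats = []
    for s in tri.simplices:
        p = minutiae_xy[s]
        sides = sorted(np.linalg.norm(p[i] - p[(i + 1) % 3])
                       for i in range(3))
        feats.append(sides)              # one invariant triple per triangle
    return np.array(feats)

rng = np.random.default_rng(8)
print(triangle_features(rng.random((25, 2)) * 300).shape)
```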

Journal ArticleDOI
19 Apr 2012-Sensors
TL;DR: A guided wavelet transform (WT) based deployment strategy (WTDS) for 3D terrains is proposed, in which the sensor movements are carried out within the mutation phase of a genetic algorithm (GA); it aims to maximize the Quality of Coverage (QoC) of a WSN by deploying a limited number of sensors on a 3D surface.
Abstract: One of the most critical issues of Wireless Sensor Networks (WSNs) is the deployment of a limited number of sensors in order to achieve maximum coverage on a terrain. Optimal sensor deployment, which enables one to minimize the consumed energy, communication time and manpower for the maintenance of the network, has attracted growing interest in the last decade. Most of the studies in the literature are proposed for two-dimensional (2D) surfaces; however, real-world sensor deployments often arise in three-dimensional (3D) environments. In this paper, a guided wavelet transform (WT) based deployment strategy (WTDS) for 3D terrains, in which the sensor movements are carried out within the mutation phase of the genetic algorithms (GAs), is proposed. The proposed algorithm aims to maximize the Quality of Coverage (QoC) of a WSN by deploying a limited number of sensors on a 3D surface, utilizing a probabilistic sensing model and Bresenham's line-of-sight (LOS) algorithm. In addition, the method followed in this paper is novel to the literature, and the performance of the proposed algorithm is compared with the Delaunay Triangulation (DT) method as well as a standard genetic algorithm based method; the results reveal that the proposed method is a more powerful and more successful method for sensor deployment on 3D terrains.

Journal ArticleDOI
S.H. Lo
TL;DR: The parallel 3D Delaunay triangulation algorithm has been coded in Intel FORTRAN VS2010, and the scalability of the parallel zonal insertion algorithm has been tested on a multi-core machine with 12 processors using OpenMP parallel directives with shared memory.

Journal ArticleDOI
TL;DR: In this article, a point to triangular patch (i.e., closest three points) match is established by checking if the point falls within the triangular dipyramid, which has the three triangular patch points as a base and a user-chosen normal distance as the height to establish the two peaks.
Abstract: The registration of multiple surface point clouds into a common reference frame is a well-addressed topic, and the Iterative Closest Point (ICP) algorithm is perhaps the most used method for registering laser scans due to their irregular nature. In this paper, we examine the proposed Iterative Closest Projected Point (ICPP) algorithm for the simultaneous registration of multiple point clouds. First, a point to triangular patch (i.e. closest three points) match is established by checking whether the point falls within the triangular dipyramid that has the three triangular patch points as a base and a user-chosen normal distance as the height establishing the two peaks. Then, the point is projected onto the patch surface, and its projection is used as a match for the original point. It is also shown through empirical experimentation that Delaunay triangles are not a requirement for establishing matches. In fact, Delaunay triangles in some scenarios may force blunders into the final solution, while using the closest three points avoids some undesired erroneous points. In addition, we review the algorithm that inspired the ICPP, namely the Iterative Closest Patch (ICPatch), in which conjugate point-patch pairs are extracted in the overlapping surface areas and the transformation parameters between all neighbouring surfaces are estimated in a pairwise manner. Then, using the conjugate point-patch pairs, and applying the transformation parameters from the pairwise registration as initial approximations, the final surface transformation parameters are solved for simultaneously. Finally, we evaluate the assumptions made and examine the performance of the new algorithm against the ICPatch.
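
A sketch of the point-to-patch test as we read it from the description above, assuming the dipyramid check reduces to "the projection falls inside the triangle and the absolute normal distance is at most the user-chosen height h" (the notation is ours):

```python
# Sketch of the ICPP point-to-patch match: reject if the point is farther
# than h from the patch plane, otherwise accept when its projection lands
# inside the triangle (barycentric test).
import numpy as np

def match_point_to_patch(q, a, b, c, h):
    """Return the projection of q onto triangle abc, or None if no match."""
    n = np.cross(b - a, c - a)
    n = n / np.linalg.norm(n)               # unit patch normal
    d = np.dot(q - a, n)                    # signed normal distance
    if abs(d) > h:                          # outside the dipyramid height
        return None
    p = q - d * n                           # projection onto the patch plane
    T = np.column_stack([b - a, c - a])     # barycentric coordinates of p
    (u, v), *_ = np.linalg.lstsq(T, p - a, rcond=None),
    u, v = np.linalg.lstsq(T, p - a, rcond=None)[0]
    if u >= 0 and v >= 0 and u + v <= 1:    # inside the triangle
        return p
    return None

tri = [np.array(v, float) for v in ([0, 0, 0], [1, 0, 0], [0, 1, 0])]
print(match_point_to_patch(np.array([0.2, 0.2, 0.05]), *tri, h=0.1))
```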

01 Jan 2012
TL;DR: This chapter discusses Voronoi and Delaunay tessellations, their triangulations, and flip-graphs.
Abstract: Keywords: Voronoi; Delaunay; tesselations; triangulations; flip-graphs. URL: http://www.math.uiuc.edu/documenta/vol-ismp/60_liebling-thomas.html

Book ChapterDOI
01 Jan 2012
TL;DR: This article presents, for the first time, a unified discussion of quantization methods for pricing multiple exercise options for both Voronoi and Delaunay quantization, and illustrates the performance of both methods with several numerical examples.
Abstract: We review in this article pure quantization methods for the pricing of multiple exercise options. These quantization methods have the common advantage that they allow a straightforward implementation of the Backward Dynamic Programming Principle for optimal stopping and stochastic control problems. Moreover, we present here for the first time a unified discussion of this topic for Voronoi and Delaunay quantization and illustrate the performance of both methods by several numerical examples.

Journal ArticleDOI
TL;DR: This paper presents the development and application of a two-dimensional, automatic unstructured mesh generator for shallow water models called Admesh, and several meshes of shallow water domains created by the new mesh generator are presented.
Abstract: In this paper, we present the development and application of a two-dimensional, automatic unstructured mesh generator for shallow water models called Admesh. Starting with only target minimum and maximum element sizes and points defining the boundary and bathymetry/topography of the domain, the goal of the mesh generator is to automatically produce a high-quality mesh from this minimal set of input. From the geometry provided, properties such as local features, curvature of the boundary, bathymetric/topographic gradients, and approximate flow characteristics can be extracted, which are then used to determine local element sizes. The result is a high-quality mesh, with the correct amount of refinement where it is needed to resolve all the geometry and flow characteristics of the domain. Techniques incorporated include the use of the so-called signed distance function, which is used to determine critical geometric properties, the approximation of piecewise linear coastline data by smooth cubic splines, a so-called mesh function used to determine element sizes and control the size ratio of neighboring elements, and a spring-based force equilibrium approach used to improve the element quality of an initial mesh obtained from a simple Delaunay triangulation. Several meshes of shallow water domains created by the new mesh generator are presented.
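
A toy sketch of the signed-distance-function ingredient, with an analytic circular domain standing in for Admesh's coastline-derived distance function:

```python
# Sketch: a signed distance function (negative inside the domain) used to
# seed points and discard Delaunay elements outside the boundary.
import numpy as np
from scipy.spatial import Delaunay

def sdf_circle(p, center=(0.0, 0.0), r=1.0):
    return np.linalg.norm(p - np.asarray(center), axis=1) - r  # < 0 inside

rng = np.random.default_rng(4)
cand = rng.uniform(-1.0, 1.0, size=(2000, 2))
pts = cand[sdf_circle(cand) < 0]          # keep interior points only
tri = Delaunay(pts)

# Drop triangles whose centroid falls outside the domain.
cent = pts[tri.simplices].mean(axis=1)
keep = tri.simplices[sdf_circle(cent) < -1e-3]
print(len(keep), "interior elements")
```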

Journal ArticleDOI
TL;DR: This work exploits the visual nature of pit patterns on the colonic mucosa to develop an approach that is faster and more robust against overfitting than other methods.

Proceedings ArticleDOI
09 Mar 2012
TL;DR: This work proposes the first graphics processing unit (GPU) solution to compute the 2D constrained Delaunay triangulation (CDT) of a planar straight line graph (PSLG) consisting of points and edges using the CUDA programming model on NVIDIA GPUs, and accelerates the entire computation on the GPU.
Abstract: We propose the first GPU solution to compute the 2D constrained Delaunay triangulation (CDT) of a planar straight line graph (PSLG) consisting of points and edges. There are many CPU algorithms developed to solve the CDT problem in computational geometry, yet there has been no known prior approach using the parallel computing power of the GPU to solve this problem efficiently. For the special case of the CDT problem with a PSLG consisting of just points, which is the normal Delaunay triangulation problem, a hybrid approach has already been presented that uses the GPU together with the CPU to partially speed up the computation. Our work, on the other hand, accelerates the whole computation on the GPU. Our implementation using the CUDA programming model on NVIDIA GPUs is numerically robust, with speedups of up to an order of magnitude over the best sequential implementations on the CPU. This result is reflected in our experiments with both randomly generated PSLGs and real-world GIS data with millions of points and edges.
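
For orientation, the same CDT problem can be set up on the CPU with the `triangle` package, a Python wrapper around Shewchuk's Triangle (assumed installed); this is a reference point, not the paper's GPU method:

```python
# CPU reference for the CDT problem: triangulate a PSLG (points + one
# constraint segment) so that the segment appears as a triangulation edge.
import numpy as np
import triangle  # pip install triangle (wrapper around Shewchuk's Triangle)

pslg = {
    'vertices': np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0],
                          [0.2, 0.5], [0.8, 0.5]]),
    'segments': np.array([[4, 5]]),     # edge that must be preserved
}
cdt = triangle.triangulate(pslg, 'pc')  # 'p' = PSLG mode, 'c' = keep hull
print(cdt['triangles'])                 # triangles respecting the segment
```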

21 Mar 2012
TL;DR: MMG3D is a tetrahedral fully automatic remesher that produces quasi-uniform meshes with respect to a metric tensor field that prescribes a length and a direction for the edges so that the resulting meshes will be anisotropic.
Abstract: MMG3D is a fully automatic tetrahedral remesher. Starting from a tetrahedral mesh, it produces quasi-uniform meshes with respect to a metric tensor field. This tensor prescribes a length and a direction for the edges, so that the resulting meshes will be anisotropic. The software is based on local mesh modifications, and an anisotropic version of the Delaunay kernel is implemented to insert vertices into the mesh. Moreover, MMG3D allows one to deal with rigid body motion and moving meshes. When a displacement is prescribed on a part of the boundary, a final mesh is generated such that the surface points are moved according to this displacement. More details can be found at http://www.math.u-bordeaux1.fr/~dobj/logiciels/mmg3d.php.

Journal ArticleDOI
Tamal K. Dey, Xiaoyin Ge, Qichao Que, Issam Safa, Lei Wang, Yusu Wang
TL;DR: This paper allows the presence of various singularities by requiring that the sampled object is a collection of smooth surface patches with boundaries that can meet or intersect.
Abstract: Reconstructing a surface mesh from a set of discrete point samples is a fundamental problem in geometric modeling. It becomes challenging in the presence of 'singularities' such as boundaries, sharp features, and non-manifolds. Some current research in reconstruction has addressed handling some of these singularities, but a unified approach to handle them all is missing. In this paper we allow the presence of various singularities by requiring that the sampled object is a collection of smooth surface patches with boundaries that can meet or intersect. Our algorithm first identifies and reconstructs the features where singularities occur. Next, it reconstructs the surface patches containing these feature curves. The identification and reconstruction of feature curves are achieved by a novel combination of the Gaussian-weighted graph Laplacian and Reeb graphs. The global reconstruction is achieved by a method akin to the well-known Cocone reconstruction, but with a weighted Delaunay triangulation that allows protecting the feature samples with balls. We provide various experimental results to demonstrate the effectiveness of our feature-preserving singular surface reconstruction algorithm. © 2012 Wiley Periodicals, Inc.

Journal ArticleDOI
TL;DR: The purpose of this paper is to describe a phyllotactic organization of points through its Voronoi cells and Delaunay triangulation and to refer to the concept of defects developed in condensed matter physics.
Abstract: Phyllotaxis, the search for the most homogeneous and dense organizations of small discs inside a large circular domain, was first developed to analyse arrangements of leaves or florets in plants. It has since become an object of study not only in botany, but also in mathematics, computer simulations and physics. Although the mathematical solution is now well known, an algorithm setting out the centres of the small discs on a Fermat spiral, the very nature of this organization and its properties of symmetry remain to be examined. The purpose of this paper is to describe a phyllotactic organization of points through its Voronoi cells and Delaunay triangulation and to refer to the concept of defects developed in condensed matter physics. The topological constraint of circular symmetry introduces an original inflation–deflation symmetry taking the place of the translational and rotational symmetries of classical crystallography.
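
A sketch of the phyllotactic point set the paper analyses, using Vogel's well-known construction (centres on a Fermat spiral separated by the golden angle), followed by the Delaunay and Voronoi structures used to describe it:

```python
# Sketch: Vogel's phyllotaxis model places point k at radius sqrt(k) and
# angle k * golden_angle on a Fermat spiral; its Voronoi cells are mostly
# hexagonal, with defect lines reflecting the circular symmetry.
import numpy as np
from scipy.spatial import Delaunay, Voronoi

n = 500
golden_angle = np.pi * (3.0 - np.sqrt(5.0))   # ~137.508 degrees
k = np.arange(1, n + 1)
r = np.sqrt(k)                                # Fermat spiral radius
theta = k * golden_angle
pts = np.column_stack([r * np.cos(theta), r * np.sin(theta)])

tri = Delaunay(pts)
vor = Voronoi(pts)
print(len(tri.simplices), "triangles,", len(vor.regions), "Voronoi regions")
```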

Journal ArticleDOI
TL;DR: This work presents an integrated approach called boundary-optimized Delaunay triangulation (B-ODT) to smooth (improve) a tetrahedral mesh and can be readily adapted to preserve sharp features in the original mesh.

Journal ArticleDOI
TL;DR: The characteristics of MOCUM make it well suited to high-fidelity full-core calculations for current and Gen-IV reactor core designs; it can enhance safety margins with acceptable confidence levels, leading to more economically optimized designs.

01 Jan 2012
TL;DR: This work proposes a new vertex insertion algorithm that, given a new vertex to be inserted into a CDT, guarantees a new CDT containing that vertex by incrementally adding one or more Steiner points.
Abstract: Constrained Delaunay tetrahedralizations (CDTs) are valuable for generating meshes of nonconvex domains and domains with internal boundaries, but they are difficult to maintain robustly when finite-precision coordinates yield vertices on a line that are not perfectly collinear and polygonal facets that are not perfectly flat. We experimentally compare two recent algorithms for inserting a polygonal facet into a CDT: a bistellar flip algorithm of Shewchuk (Proc. 19th Annual Symposium on Computational Geometry, June 2003) and a cavity retriangulation algorithm of Si and Gärtner (Proc. Fourteenth International Meshing Roundtable, September 2005). We modify these algorithms to succeed in practice for polygons whose vertices deviate from exact coplanarity.

Journal ArticleDOI
TL;DR: Given a set P of n points in the plane, it is shown how to compute in O(n log n) time a spanning subgraph of their Delaunay triangulation that has maximum degree 7 and is a strong plane t-spanner of P with t = (1 + √2)² · δ, where δ is the spanning ratio of the Delaunay triangulation.
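
A sketch that measures the quantity δ appearing in this bound, the spanning ratio of a Delaunay triangulation: the maximum over point pairs of graph shortest-path length divided by Euclidean distance (brute force, for illustration only):

```python
# Sketch: estimate the spanning ratio delta of a Delaunay triangulation
# by comparing graph shortest paths to straight-line distances.
import numpy as np
from scipy.spatial import Delaunay
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import shortest_path

rng = np.random.default_rng(5)
pts = rng.random((60, 2))
tri = Delaunay(pts)

n = len(pts)
w = lil_matrix((n, n))
for s in tri.simplices:                 # weight each Delaunay edge by length
    for i in range(3):
        a, b = s[i], s[(i + 1) % 3]
        w[a, b] = w[b, a] = np.linalg.norm(pts[a] - pts[b])

g = shortest_path(w.tocsr(), directed=False)
eu = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
mask = ~np.eye(n, dtype=bool)
print("spanning ratio ~", np.max(g[mask] / eu[mask]))
```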

Proceedings ArticleDOI
01 Dec 2012
TL;DR: This paper presents a novel relay node placement heuristic called Incremental Optimization based on Delaunay Triangulation (IO-DT), which takes advantage of the feasibility of finding an optimal solution for the case of three terminals and is superior to competing schemes.
Abstract: Relay node placement in wireless sensor networks has gained importance due to its potential use in prolonging network life time, reducing data latency, and establishing connected topologies. In this paper we studied the relay node placement problem to establish multi-hop communication paths between every pair terminals (i.e., sensors) where each hop in the path is less than a common communication range. Such a problem is defined as Steiner Tree problem with minimum Steiner points and Bounded Edge-Length problem which is known to be NP-Hard. This paper presents a novel relay node placement heuristics called Incremental Optimization based on Delaunay Triangulation (IO-DT). The algorithm takes advantage of feasibility of finding optimal solution for the case of three terminals. IO-DT calculates the Delaunay triangulation (DT) of terminals and iterates over the formed triangles. In each iteration the algorithm steinerizes a triangle as part of the final topology if selecting such a triangle provides a reduction in total number of relay node required as compared to the minimum spanning tree (mst) based approach. The time complexity of IO-DT is quadratic in the number of terminals, which is superior to competing schemes. The performance of the algorithm is validated through simulation.