Proceedings ArticleDOI

Dual/Primal mesh optimization for polygonized implicit surfaces

17 Jun 2002, pp. 171-178
TL;DR: A new method for improving polygonizations of implicit surfaces with sharp features is proposed, which outperforms approaches based on the mesh evolution paradigm in speed and accuracy.
Abstract: A new method for improving polygonizations of implicit surfaces with sharp features is proposed. The method is based on the observation that, given an implicit surface with sharp features, a triangle mesh whose triangles are tangent to the implicit surface at certain inner triangle points gives a better approximation of the implicit surface than the standard marching cubes mesh of Lorensen and Cline (in our experiments we use the VTK implementation of marching cubes). First, given an initial triangle mesh, its dual mesh composed of the triangle centroids is considered. Then the dual mesh is modified such that its vertices are placed on the implicit surface and the mesh dual to the modified dual mesh is considered. Finally the vertex positions of that "double dual" mesh are optimized by minimizing a quadratic energy measuring the deviation of the mesh normals from the implicit surface normals computed at the vertices of the modified dual mesh. In order to achieve an accurate approximation of fine surface features, these basic steps are combined with adaptive mesh subdivision and curvature-weighted vertex resampling. The proposed method outperforms approaches based on the mesh evolution paradigm in speed and accuracy.
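As a rough illustration of the pipeline described above, the sketch below (Python with NumPy; the function names and the per-vertex solver are simplifications introduced here, not the authors' code) projects face centroids onto the implicit surface and then repositions each primal vertex by a minimum-norm least-squares fit to the tangent planes at the projected centroids of its incident faces.

```python
import numpy as np

def project_to_surface(p, f, grad_f, n_iter=10):
    """Newton-like projection of a point onto the implicit surface f(x) = 0."""
    for _ in range(n_iter):
        g = grad_f(p)
        p = p - f(p) * g / max(np.dot(g, g), 1e-12)
    return p

def dual_primal_step(V, F, f, grad_f):
    """One simplified dual/primal pass over a triangle mesh (V: n x 3, F: m x 3).

    1. Dual vertices = face centroids, projected onto f(x) = 0.
    2. Each primal vertex is re-solved as the least-squares point of the surface
       tangent planes at the projected centroids of its incident faces (the paper
       minimizes a related quadratic normal-deviation energy with a minimum-norm
       SVD solution).
    """
    # Dual mesh: projected face centroids and the surface normals there.
    C = V[F].mean(axis=1)
    P = np.array([project_to_surface(c, f, grad_f) for c in C])
    N = np.array([grad_f(p) for p in P])
    N /= np.linalg.norm(N, axis=1, keepdims=True)

    # Double dual: for each vertex solve  min_x  sum_j ( n_j . (x - p_j) )^2
    # with a minimum-norm least-squares solution around the mean of the p_j.
    V_new = V.copy()
    for vi in range(len(V)):
        incident = np.where((F == vi).any(axis=1))[0]
        if len(incident) == 0:
            continue
        A = N[incident]
        x0 = P[incident].mean(axis=0)
        b = np.einsum('ij,ij->i', A, P[incident] - x0)
        dx, *_ = np.linalg.lstsq(A, b, rcond=None)
        V_new[vi] = x0 + dx
    return V_new
```

For a unit sphere one could take f = lambda x: x @ x - 1.0 and grad_f = lambda x: 2.0 * x; a few passes already tighten a coarse marching cubes mesh around the surface, whereas the paper additionally interleaves adaptive subdivision and curvature-weighted resampling to capture sharp features.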


Citations
Proceedings ArticleDOI
01 Jul 2003
TL;DR: A new shape representation is presented, the multi-level partition of unity implicit surface, that allows us to construct surface models from very large sets of points, and can accurately represent sharp features such as edges and corners by selecting appropriate shape functions.
Abstract: We present a new shape representation, the multi-level partition of unity implicit surface, that allows us to construct surface models from very large sets of points. There are three key ingredients to our approach: 1) piecewise quadratic functions that capture the local shape of the surface, 2) weighting functions (the partitions of unity) that blend together these local shape functions, and 3) an octree subdivision method that adapts to variations in the complexity of the local shape. Our approach gives us considerable flexibility in the choice of local shape functions, and in particular we can accurately represent sharp features such as edges and corners by selecting appropriate shape functions. An error-controlled subdivision leads to an adaptive approximation whose time and memory consumption depends on the required accuracy. Due to the separation of local approximation and local blending, the representation is not global and can be created and evaluated rapidly. Because our surfaces are described using implicit functions, operations such as shape blending, offsets, deformations and CSG are simple to perform.
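The core blending idea is compact enough to sketch. The snippet below (Python with NumPy; the weight profile, the octree handling, and the fitting of the local quadratics are omitted or simplified, so all names are illustrative) evaluates a partition-of-unity blend of precomputed local approximations.

```python
import numpy as np

def weight(r):
    """Compactly supported weight profile b(r), nonzero only for r < 1."""
    return np.where(r < 1.0, (1.0 - r) ** 2, 0.0)

def mpu_value(x, centers, radii, local_fns):
    """Partition-of-unity blend of local approximations Q_i:
         f(x) = sum_i w_i(x) Q_i(x) / sum_i w_i(x),
    with w_i(x) = b(|x - c_i| / R_i) supported on a ball of radius R_i
    around the cell center c_i."""
    w = np.array([weight(np.linalg.norm(x - c) / R)
                  for c, R in zip(centers, radii)])
    total = w.sum()
    if total == 0.0:
        raise ValueError("x lies outside the support of every local function")
    q = np.array([Q(x) for Q in local_fns])
    return float(np.dot(w, q) / total)
```

In the actual method each Q_i is a quadratic fitted to the points inside an octree cell, and cells whose fit error exceeds a tolerance are subdivided, which is what makes the approximation error-controlled.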

796 citations


Cites methods from "Dual/Primal mesh optimization for p..."

  • ...We find it attractive to combine a low resolution Bloomenthal polygonization with a postprocessing mesh optimization technique developed in [Ohtake and Belyaev 2002], as shown in the top middle, top right, and bottom middle images of Figure 7....


  • ...An alternative is to use adaptive polygonization strategies such as those in [Kobbelt et al. 2001; Ohtake and Belyaev 2002; Ju et al. 2002]....


Journal ArticleDOI
TL;DR: This paper introduces a method to extract 'Shape-DNA', a numerical fingerprint or signature, of any 2D or 3D manifold by taking the eigenvalues (i.e. the spectrum) of its Laplace-Beltrami operator and succeeds in computing eigenvalues for smoothly bounded objects without discretization errors caused by approximation of the boundary.
Abstract: This paper introduces a method to extract 'Shape-DNA', a numerical fingerprint or signature, of any 2d or 3d manifold (surface or solid) by taking the eigenvalues (i.e. the spectrum) of its Laplace-Beltrami operator. Employing the Laplace-Beltrami spectra (not the spectra of the mesh Laplacian) as fingerprints of surfaces and solids is a novel approach. Since the spectrum is an isometry invariant, it is independent of the object's representation including parametrization and spatial position. Additionally, the eigenvalues can be normalized so that uniform scaling factors for the geometric objects can be obtained easily. Therefore, checking if two objects are isometric needs no prior alignment (registration/localization) of the objects but only a comparison of their spectra. In this paper, we describe the computation of the spectra and their comparison for objects represented by NURBS or other parametrized surfaces (possibly glued to each other), polygonal meshes as well as solid polyhedra. Exploiting the isometry invariance of the Laplace-Beltrami operator we succeed in computing eigenvalues for smoothly bounded objects without discretization errors caused by approximation of the boundary. Furthermore, we present two non-isometric but isospectral solids that cannot be distinguished by the spectra of their bodies and present evidence that the spectra of their boundary shells can tell them apart. Moreover, we show the rapid convergence of the heat trace series and demonstrate that it is computationally feasible to extract geometrical data such as the volume, the boundary length and even the Euler characteristic from the numerically calculated eigenvalues. This fact not only confirms the accuracy of our computed eigenvalues, but also underlines the geometrical importance of the spectrum. With the help of this Shape-DNA, it is possible to support copyright protection, database retrieval and quality assessment of digital data representing surfaces and solids. A patent application based on ideas presented in this paper is pending.
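A common way to approximate such a spectrum on a triangle mesh is a linear ("cotangent") FEM discretization of the Laplace-Beltrami operator; the paper itself uses higher-order FEM on parametrized surfaces and solids precisely to keep discretization error low, so the sketch below (Python with SciPy; all names are illustrative) is only a rough stand-in.

```python
import numpy as np
from scipy.sparse import coo_matrix, diags
from scipy.sparse.linalg import eigsh

def shape_dna(V, F, k=15):
    """First k nonzero Laplace-Beltrami eigenvalues of a triangle mesh
    (V: n x 3 vertices, F: m x 3 faces), lumped linear-FEM cotangent form."""
    n = len(V)
    rows, cols, vals = [], [], []
    mass = np.zeros(n)
    for tri in F:
        i, j, l = (int(t) for t in tri)
        p = V[[i, j, l]]
        area2 = np.linalg.norm(np.cross(p[1] - p[0], p[2] - p[0]))  # twice the area
        mass[[i, j, l]] += area2 / 6.0                              # barycentric lumping
        for a, b, c in ((0, 1, 2), (1, 2, 0), (2, 0, 1)):
            # cotangent of the angle at vertex a weights the opposite edge (b, c)
            u, w = p[b] - p[a], p[c] - p[a]
            cot = np.dot(u, w) / max(area2, 1e-12)
            vb, vc = (i, j, l)[b], (i, j, l)[c]
            rows += [vb, vc, vb, vc]
            cols += [vc, vb, vb, vc]
            vals += [-0.5 * cot, -0.5 * cot, 0.5 * cot, 0.5 * cot]
    L = coo_matrix((vals, (rows, cols)), shape=(n, n)).tocsr()  # stiffness matrix
    M = diags(mass)                                             # lumped mass matrix
    # Generalized eigenproblem L u = lambda M u; drop the trivial zero eigenvalue.
    lam = eigsh(L, k=k + 1, M=M, sigma=-0.01, which='LM', return_eigenvectors=False)
    return np.sort(lam)[1:]
```

The resulting sorted eigenvalue vector (optionally normalized by surface area) is the kind of isometry-invariant signature the paper compares across shapes.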

789 citations


Cites background from "Dual/Primal mesh optimization for p..."

  • ...new concepts [12,50] useful to mesh implicit surfaces....


Journal ArticleDOI
TL;DR: This work uses partial differential equation techniques to remove noise from digital images: a total-variation filter smooths the normal vectors of the level curves of a noisy image, a surface is then fitted to the smoothed normals, and finite difference schemes are used to solve the resulting equations.
Abstract: In this work, we use partial differential equation techniques to remove noise from digital images. The removal is done in two steps. We first use a total-variation filter to smooth the normal vectors of the level curves of a noisy image. After this, we try to find a surface to fit the smoothed normal vectors. For each of these two stages, the problem is reduced to a nonlinear partial differential equation. Finite difference schemes are used to solve these equations. A broad range of numerical examples is given in the paper.
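A heavily simplified finite-difference version of the two stages might look as follows (Python with NumPy; the step sizes, boundary handling, and the exact energies are assumptions made here, not the authors' scheme): TV-flow smoothing of the normal field, then gradient descent on a fitting energy that pulls the level-curve normals of the image toward the smoothed field.

```python
import numpy as np

def grad(u):
    """Forward differences with replicated boundary."""
    ux = np.diff(u, axis=1, append=u[:, -1:])
    uy = np.diff(u, axis=0, append=u[-1:, :])
    return ux, uy

def div(px, py):
    """Backward-difference divergence (discrete adjoint of grad)."""
    return (np.diff(px, axis=1, prepend=px[:, :1]) +
            np.diff(py, axis=0, prepend=py[:1, :]))

def unit_normals(u, eps=1e-3):
    """Unit normals of the level curves of the image u."""
    ux, uy = grad(u)
    mag = np.sqrt(ux**2 + uy**2 + eps**2)
    return ux / mag, uy / mag

def tv_smooth_normals(nx, ny, n_iter=100, dt=0.1, eps=1e-3):
    """Stage 1: explicit TV-flow smoothing of each normal component,
    re-normalizing after every step."""
    nx, ny = nx.copy(), ny.copy()
    for _ in range(n_iter):
        for comp in (nx, ny):
            cx, cy = grad(comp)
            mag = np.sqrt(cx**2 + cy**2 + eps**2)
            comp += dt * div(cx / mag, cy / mag)
        norm = np.sqrt(nx**2 + ny**2) + 1e-12
        nx /= norm
        ny /= norm
    return nx, ny

def fit_to_normals(u0, nx, ny, lam=0.1, n_iter=200, dt=0.1, eps=1e-3):
    """Stage 2: gradient descent on an energy that pulls the level-curve
    normals of u toward the smoothed field (nx, ny) while staying close to u0."""
    u = u0.copy()
    for _ in range(n_iter):
        ux, uy = grad(u)
        mag = np.sqrt(ux**2 + uy**2 + eps**2)
        u += dt * (div(ux / mag - nx, uy / mag - ny) - lam * (u - u0))
    return u
```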

217 citations

DOI
01 May 2003
TL;DR: The numerical experiments suggest that the approach integrates the best aspects of scattered data fitting with locally and globally supported basis functions and is substantially faster than the state-of-the-art scattered data approximation with globally supported RBFs and much simpler to implement.

214 citations


Cites methods from "Dual/Primal mesh optimization for p..."

  • ...As a postprocessing step we can also employ a method proposed in [28] in order to improve the mesh quality....


Proceedings ArticleDOI
12 May 2003
TL;DR: In this article, a hierarchical approach to 3D scattered data interpolation with compactly supported basis functions is proposed, which integrates the best aspects of scattered data fitting with locally and globally supported RBFs.
Abstract: We propose a hierarchical approach to 3D scattered data interpolation with compactly supported basis functions. Our numerical experiments suggest that the approach integrates the best aspects of scattered data fitting with locally and globally supported basis functions. Employing locally supported functions leads to an efficient computational procedure, while a coarse-to-fine hierarchy makes our method insensitive to the density of scattered data and allows us to restore large parts of missing data. Given a point cloud distributed along a surface, we first use spatial downsampling to construct a coarse-to-fine hierarchy of point sets. Then we interpolate the sets starting from the coarsest level. Each point set of the hierarchy is interpolated as an offset of the interpolating function computed at the previous level. According to our numerical experiments, the method is substantially faster than the state-of-the-art scattered data approximation with globally supported RBFs (Carr et al., 2001) and much simpler to implement.
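The coarse-to-fine offsetting can be sketched as follows (Python with SciPy; the function names, the dense solve, and the halving of the support radius per level are illustrative assumptions, and the construction of signed off-surface values needed for actual surface reconstruction is omitted): each level interpolates only the residual left by the sum of all coarser-level interpolants, using Wendland's compactly supported kernel.

```python
import numpy as np
from scipy.spatial import cKDTree

def wendland(r):
    """Wendland's compactly supported C^2 kernel, nonzero for r < 1."""
    return np.where(r < 1.0, (1.0 - r) ** 4 * (4.0 * r + 1.0), 0.0)

def fit_level(points, residuals, support):
    """Interpolate residual values at 'points' with CSRBFs of radius 'support'."""
    tree = cKDTree(points)
    n = len(points)
    A = np.zeros((n, n))            # dense for clarity; a sparse solver scales better
    for i, nbrs in enumerate(tree.query_ball_point(points, support)):
        d = np.linalg.norm(points[nbrs] - points[i], axis=1)
        A[i, nbrs] = wendland(d / support)
    coef = np.linalg.solve(A, residuals)
    def evaluate(x):
        nbrs = tree.query_ball_point(x, support)
        d = np.linalg.norm(points[nbrs] - x, axis=1)
        return float(wendland(d / support) @ coef[nbrs])
    return evaluate

def hierarchical_interpolate(point_levels, value_levels, base_support):
    """Coarse-to-fine interpolation: each level interpolates only the residual
    ('offset') left by the sum of all coarser-level interpolants."""
    fitted, support = [], base_support
    def f(x):
        return sum(g(x) for g in fitted)
    for pts, vals in zip(point_levels, value_levels):
        residual = np.array([v - f(p) for p, v in zip(pts, vals)])
        fitted.append(fit_level(pts, residual, support))
        support *= 0.5              # finer levels use smaller supports
    return f
```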

202 citations

References
Proceedings ArticleDOI
01 Aug 1987
TL;DR: In this paper, a divide-and-conquer approach is used to generate inter-slice connectivity, a case table defines the triangle topology within each cube, and triangle vertices are calculated by linear interpolation.
Abstract: We present a new algorithm, called marching cubes, that creates triangle models of constant density surfaces from 3D medical data. Using a divide-and-conquer approach to generate inter-slice connectivity, we create a case table that defines triangle topology. The algorithm processes the 3D medical data in scan-line order and calculates triangle vertices using linear interpolation. We find the gradient of the original data, normalize it, and use it as a basis for shading the models. The detail in images produced from the generated surface models is the result of maintaining the inter-slice connectivity, surface data, and gradient information present in the original 3D data. Results from computed tomography (CT), magnetic resonance (MR), and single-photon emission computed tomography (SPECT) illustrate the quality and functionality of marching cubes. We also discuss improvements that decrease processing time and add solid modeling capabilities.
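Two small building blocks of the algorithm, the 8-bit case index and the linear interpolation of a vertex along a cube edge, can be sketched as follows (Python; the 256-entry triangle table itself is omitted, and the helper names are chosen here for illustration).

```python
import numpy as np

def cube_case_index(corner_values, iso=0.0):
    """8-bit index into the marching cubes case table: bit i is set when
    corner i of the cube lies below the iso-value (inside the surface)."""
    return sum(1 << i for i, v in enumerate(corner_values) if v < iso)

def interpolate_edge(p0, p1, v0, v1, iso=0.0):
    """Place a triangle vertex on a cube edge by linear interpolation of the
    sampled values v0, v1 at the edge endpoints p0, p1."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    if abs(v1 - v0) < 1e-12:
        return 0.5 * (p0 + p1)
    t = (iso - v0) / (v1 - v0)
    return (1.0 - t) * p0 + t * p1
```

The case index selects which cube edges carry triangle vertices; the interpolation then places those vertices at the estimated iso-crossings, which is what gives the meshes their sub-voxel accuracy.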

13,231 citations

Journal ArticleDOI

11,285 citations


"Dual/Primal mesh optimization for p..." refers methods in this paper

  • ...Similar to [10] we use the singular value decomposition [17] to find a minimum-norm least squares solution to (1)....


Proceedings ArticleDOI
03 Aug 1997
TL;DR: This work has developed a surface simplification algorithm which can rapidly produce high quality approximations of polygonal models, and which also supports non-manifold surface models.
Abstract: Many applications in computer graphics require complex, highly detailed models. However, the level of detail actually necessary may vary considerably. To control processing time, it is often desirable to use approximations in place of excessively detailed models. We have developed a surface simplification algorithm which can rapidly produce high quality approximations of polygonal models. The algorithm uses iterative contractions of vertex pairs to simplify models and maintains surface error approximations using quadric matrices. By contracting arbitrary vertex pairs (not just edges), our algorithm is able to join unconnected regions of models. This can facilitate much better approximations, both visually and with respect to geometric error. In order to allow topological joining, our system also supports non-manifold surface models.
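The quadric bookkeeping at the heart of the algorithm is compact enough to sketch (Python with NumPy; the names are illustrative, and the pair selection, priority queue, and fallback placements of the full algorithm are omitted).

```python
import numpy as np

def plane_quadric(p0, p1, p2):
    """4x4 quadric K = q q^T of the supporting plane of a triangle, where
    q = (a, b, c, d), ax + by + cz + d = 0 and (a, b, c) is the unit normal."""
    n = np.cross(p1 - p0, p2 - p0)
    n = n / np.linalg.norm(n)
    q = np.append(n, -np.dot(n, p0))
    return np.outer(q, q)

def contraction_cost(Q, v):
    """Quadric error v^T Q v (sum of squared distances to the accumulated
    planes) of placing the contracted vertex at position v."""
    vh = np.append(v, 1.0)
    return float(vh @ Q @ vh)

def optimal_position(Q):
    """Position minimizing the quadric error; returns None when the 3x3 block
    is near-singular (the algorithm then falls back to endpoints or midpoint)."""
    A, b = Q[:3, :3], -Q[:3, 3]
    if abs(np.linalg.det(A)) < 1e-12:
        return None
    return np.linalg.solve(A, b)
```

Each vertex accumulates the quadrics of its incident faces; when a pair is contracted the two quadrics are simply added, which is what keeps the error bookkeeping cheap during simplification.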

3,564 citations