Proceedings ArticleDOI

Decimation of triangle meshes

01 Jul 1992 - Vol. 26, Iss. 2, pp. 65-70
TL;DR: An application-independent algorithm uses local operations on geometry and topology to reduce the number of triangles in a triangle mesh; results from two different geometric modeling applications illustrate the strengths of the algorithm.
Abstract: The polygon remains a popular graphics primitive for computer graphics applications. Besides having a simple representation, computer rendering of polygons is widely supported by commercial graphics hardware and software. However, because the polygon is linear, often thousands or millions of primitives are required to capture the details of complex geometry. Models of this size are generally not practical since rendering speeds and memory requirements are proportional to the number of polygons. Consequently, applications that generate large polygonal meshes often use domain-specific knowledge to reduce model size. There remain algorithms, however, where domain-specific reduction techniques are not generally available or appropriate. One algorithm that generates many polygons is marching cubes. Marching cubes is a brute force surface construction algorithm that extracts isodensity surfaces from volume data, producing from one to five triangles within voxels that contain the surface. Although originally developed for medical applications, marching cubes has found more frequent use in scientific visualization, where the volume data sets are much smaller than those found in medical applications. A large computational fluid dynamics volume could have a finite difference grid size of order 100 by 100 by 100, while a typical medical computed tomography or magnetic resonance scanner produces over 100 slices at a resolution of 256 by 256 or 512 by 512 pixels each. Industrial computed tomography, used for inspection and analysis, has even greater resolution, varying from 512 by 512 to 1024 by 1024 pixels. For these sampled data sets, isosurface extraction using marching cubes can produce from 500k to 2,000k triangles. Even today’s graphics workstations have trouble storing and rendering models of this size. Other sampling devices can produce large polygonal models: range cameras, digital elevation data, and satellite data.
The sampling resolution of these devices is also improving, resulting in model sizes that rival those obtained from medical scanners. This paper describes an application-independent algorithm that uses local operations on geometry and topology to reduce the number of triangles in a triangle mesh. Although our implementation is for the triangle mesh, it can be directly applied to the more general polygon mesh. After describing other work related to model creation from sampled data, we describe the triangle decimation process and its implementation. Results from two different geometric modeling applications illustrate the strengths of the algorithm.

Summary (2 min read)

Introduction

  • Besides having a simple representation, computer rendering of polygons is widely supported by commercial graphics hardware and software.
  • Marching cubes is a brute force surface construction algorithm that extracts isodensity surfaces from volume data, producing from one to five triangles within voxels that contain the surface.
  • Industrial computed tomography, used for inspection and analysis, has even greater resolution, varying from 512 by 512 to 1024 by 1024 pixels.
  • The goal of the decimation algorithm is to reduce the total number of triangles in a triangle mesh, preserving the original topology and a good approximation to the original geometry.

2.1 OVERVIEW

  • Multiple passes are made over all vertices in the mesh.
  • During a pass, each vertex is a candidate for removal and, if it meets the specified decimation criteria, the vertex and all triangles that use the vertex are deleted.
  • The vertex removal process repeats, with possible adjustment of the decimation criteria, until some termination condition is met.
  • Usually the termination criterion is specified as a percent reduction of the original mesh (or equivalent), or as some maximum decimation value.
  • The three steps of the algorithm are: 1. characterize the local vertex geometry and topology, 2. evaluate the decimation criteria, and 3. triangulate the resulting hole.
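The control flow described in these bullets can be sketched as follows. This is a minimal Python skeleton, not the authors' code: the three step functions (`classify`, `meets_criteria`, `remove_and_patch`) are hypothetical stand-ins injected as parameters, so only the pass structure and termination test are shown.

```python
def decimate(vertices, classify, meets_criteria, remove_and_patch,
             num_triangles, target_reduction, max_passes=10):
    """Top-level decimation loop: repeated passes over all vertices until
    the requested reduction is reached or no vertex can be removed."""
    original = num_triangles()
    for _ in range(max_passes):
        removed = 0
        for v in list(vertices()):           # snapshot: we mutate during the pass
            kind = classify(v)               # step 1: local geometry and topology
            if kind == "complex":
                continue                     # complex vertices are never removed
            if not meets_criteria(v, kind):  # step 2: decimation criteria
                continue
            remove_and_patch(v)              # step 3: delete and re-triangulate
            removed += 1
        reduction = 1.0 - num_triangles() / original
        if reduction >= target_reduction or removed == 0:
            break
    return num_triangles()
```

Between passes a real implementation may also relax the criteria (e.g. grow the distance threshold), which is where the "possible adjustment of the decimation criteria" above fits in.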

2.2 CHARACTERIZING LOCAL

  • The first step of the decimation algorithm characterizes the local geometry and topology for a given vertex.
  • Each vertex may be assigned one of five possible classifications: simple, complex, boundary, interior edge, or corner vertex.
  • Examples of each type are shown in the figure below.
  • A simple vertex is surrounded by a complete cycle of triangles, and each edge that uses the vertex is used by exactly two triangles.
  • All vertices except complex vertices become candidates for deletion.
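The topological part of this classification can be computed from edge-use counts alone. The sketch below is illustrative, not the authors' code; it distinguishes only simple, boundary, and complex vertices, since separating interior edge and corner vertices additionally requires feature angles computed from vertex coordinates.

```python
from collections import Counter

def classify(vertex, triangles):
    """Classify a vertex from the triangles that use it (topology only)."""
    star = [t for t in triangles if vertex in t]
    edge_use = Counter()
    for tri in star:
        for other in tri:
            if other != vertex:
                edge_use[frozenset((vertex, other))] += 1
    uses = list(edge_use.values())
    if all(u == 2 for u in uses):
        return "simple"     # complete cycle: every edge shared by two triangles
    if uses.count(1) == 2 and all(u <= 2 for u in uses):
        return "boundary"   # exactly two edges lie on the mesh boundary
    return "complex"        # e.g. an edge used by three or more triangles
```

For example, a closed fan of four triangles around vertex 0 classifies as simple; removing one triangle from the fan leaves two once-used edges, making it a boundary vertex.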

2.3 EVALUATING THE DECIMATION CRITERIA

  • The characterization step produces an ordered loop of vertices and triangles that use the candidate vertex.
  • The evaluation step determines whether the triangles forming the loop can be deleted and replaced by another triangulation exclusive of the original vertex.
  • Boundary and interior edge vertices use the distance-to-edge criterion.
  • Meshes may contain areas of relatively small triangles with large feature angles, contributing relatively little to the geometric approximation.
  • The authors call this edge preservation, a user specifiable parameter.
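The two distance criteria can be sketched as follows: simple vertices are measured against the area-weighted average plane of their surrounding triangles, while boundary and interior edge vertices are measured against the line through the two edge endpoints. This is a minimal sketch under those assumptions, not the paper's implementation; degenerate (zero-area) triangles are not handled.

```python
import numpy as np

def distance_to_average_plane(v, star, points):
    """Distance from vertex v to the area-weighted average plane of the
    triangles in its star -- the criterion for simple vertices."""
    normals, centers, areas = [], [], []
    for i, j, k in star:
        p, q, r = (np.asarray(points[x], float) for x in (i, j, k))
        n = np.cross(q - p, r - p)            # twice the area, along the normal
        areas.append(0.5 * np.linalg.norm(n))
        normals.append(n / np.linalg.norm(n))
        centers.append((p + q + r) / 3.0)
    n_avg = np.average(normals, axis=0, weights=areas)
    n_avg /= np.linalg.norm(n_avg)
    c_avg = np.average(centers, axis=0, weights=areas)
    return abs(np.dot(np.asarray(points[v], float) - c_avg, n_avg))

def distance_to_edge(v, e0, e1, points):
    """Distance from vertex v to the line through e0 and e1 -- the
    criterion for boundary and interior edge vertices."""
    p, a, b = (np.asarray(points[x], float) for x in (v, e0, e1))
    d = (b - a) / np.linalg.norm(b - a)
    return np.linalg.norm((p - a) - np.dot(p - a, d) * d)
```

A vertex is removed when its distance falls below the user-specified decimation threshold; raising the threshold between passes removes more vertices.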

2.4 TRIANGULATION

  • Deleting a vertex and its associated triangles creates one (simple or boundary vertex) or two loops (interior edge vertex).
  • In addition, there are two important characteristics of the loop that can be used to advantage.
  • Second, since every loop is star-shaped, triangulation schemes based on recursive loop splitting are effective.
  • The next section describes one such scheme.
  • Once the triangulation is complete, the original vertex and its cycle of triangles are deleted.

3.1 DATA STRUCTURES

  • The data structure must contain at least two pieces of information: the geometry, or coordinates, of each vertex, and the definition of each triangle in terms of its three vertices.
  • Their implementation uses a space-efficient vertex-triangle hierarchical ring structure.
  • Taken together these pointers form a ring relationship.
  • The authors' implementation uses three lists: a list of vertex coordinates, a list of triangle definitions, and a list of lists recording the triangles that use each vertex.
  • Edge definitions are not explicit, instead edges are implicitly defined as ordered vertex pairs in the triangle definition.
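A minimal version of this three-list structure is easy to write down. The names below are illustrative, not the authors' actual code:

```python
# list 1: vertex coordinates; list 2: triangles as vertex-id triples
points    = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (1.0, 1.0, 0.0)]
triangles = [(0, 1, 2), (1, 3, 2)]

# list 3: for each vertex, the ids of the triangles that use it (the "ring")
uses = [[] for _ in points]
for t_id, tri in enumerate(triangles):
    for v in tri:
        uses[v].append(t_id)

# edges are implicit: ordered vertex pairs within each triangle definition
def edges_of(tri):
    a, b, c = tri
    return [(a, b), (b, c), (c, a)]
```

With this layout, finding a vertex's star is a single lookup in `uses`, and no separate edge table needs to be maintained when triangles are deleted.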

3.2 TRIANGULATION

  • The authors chose a recursive loop splitting procedure.
  • In order to determine whether the split forms two non-overlapping loops, the split plane is used for a half-space comparison.
  • Certain special cases may occur during the triangulation process.
  • Eliminating a vertex in such cases would modify the topology of the mesh.
  • Two different applications illustrate the triangle decimation algorithm.
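The recursive loop splitting with the half-space comparison can be sketched as follows. For clarity the sketch works in 2D, where the split plane through two loop vertices reduces to a line; the paper's version works in 3D with a split plane orthogonal to the average plane of the loop. This is an assumption-laden illustration, not the authors' implementation.

```python
import numpy as np

def valid_split(left, right, a, b, points):
    """Half-space test: the split line through vertices a and b is
    acceptable when each candidate sub-loop lies strictly on one side."""
    pa, pb = np.asarray(points[a], float), np.asarray(points[b], float)
    n = np.array([-(pb - pa)[1], (pb - pa)[0]])   # perpendicular to a-b
    side = lambda v: np.dot(np.asarray(points[v], float) - pa, n)
    ls = [side(v) for v in left if v not in (a, b)]
    rs = [side(v) for v in right if v not in (a, b)]
    return (all(s > 0 for s in ls) and all(s < 0 for s in rs)) or \
           (all(s < 0 for s in ls) and all(s > 0 for s in rs))

def triangulate_loop(loop, points):
    """Recursively split an ordered vertex loop into triangles."""
    if len(loop) == 3:
        return [tuple(loop)]
    n = len(loop)
    for i in range(n):
        for j in range(i + 2, n):
            if i == 0 and j == n - 1:
                continue                          # adjacent around the loop
            left = loop[i:j + 1]
            right = loop[j:] + loop[:i + 1]
            if valid_split(left, right, loop[i], loop[j], points):
                return (triangulate_loop(left, points) +
                        triangulate_loop(right, points))
    raise ValueError("no valid split plane found")
```

A loop of n vertices yields n - 2 triangles, which is why removing a simple vertex with n surrounding triangles reduces the mesh by exactly two.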

4.1 VOLUME MODELING

  • The first application applies the decimation algorithm to isosurfaces created from medical and industrial computed tomography scanners.
  • Earlier work reported a triangle reduction strategy that used averaging to reduce the number of triangles on this same data set.
  • The first set of figures shows the resulting bone isosurfaces for 0%, 75%, and 90% decimation, using a decimation threshold of 1/5 the voxel dimension.
  • The isosurface created from the original blade data contains 1.7 million triangles.
  • In fact, the authors could not render the original model because it exceeded the swap space on their graphics hardware.

4.2 TERRAIN MODELING

  • The authors applied the decimation algorithm to two digital elevation data sets: Honolulu, Hawaii and the Mariner Valley on Mars.
  • First the authors applied a decimation threshold of zero, eliminating over 30% of the co-planar triangles.
  • The data represents the western end of the Mariner Valley and spans about 1000 km by 500 km.
  • The last set of figures compares the shaded and wireframe models obtained via sub-sampling and decimation.



Citations
Proceedings ArticleDOI
03 Aug 1997
TL;DR: This work has developed a surface simplification algorithm which can rapidly produce high quality approximations of polygonal models, and which also supports non-manifold surface models.
Abstract: Many applications in computer graphics require complex, highly detailed models. However, the level of detail actually necessary may vary considerably. To control processing time, it is often desirable to use approximations in place of excessively detailed models. We have developed a surface simplification algorithm which can rapidly produce high quality approximations of polygonal models. The algorithm uses iterative contractions of vertex pairs to simplify models and maintains surface error approximations using quadric matrices. By contracting arbitrary vertex pairs (not just edges), our algorithm is able to join unconnected regions of models. This can facilitate much better approximations, both visually and with respect to geometric error. In order to allow topological joining, our system also supports non-manifold surface models. CR Categories: I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling—surface and object representations
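The "quadric matrices" mentioned in this abstract admit a compact illustration. The sketch below shows only the core error evaluation under standard assumptions (unit plane normals, homogeneous coordinates); it is not Garland and Heckbert's code, and the full algorithm additionally sums quadrics per vertex and minimizes the error over contraction targets.

```python
import numpy as np

def plane_quadric(a, b, c, d):
    """Quadric Q = p p^T for the plane ax + by + cz + d = 0, |(a,b,c)| = 1."""
    p = np.array([a, b, c, d], float)
    return np.outer(p, p)

def vertex_error(Q, v):
    """Error of vertex v under quadric Q: h^T Q h with h = (x, y, z, 1),
    i.e. the sum of squared distances to the planes folded into Q."""
    h = np.append(np.asarray(v, float), 1.0)
    return float(h @ Q @ h)
```

Summing the quadrics of a vertex's incident planes gives its total error, and a pair contraction simply adds the two vertices' quadrics, which is what makes the method fast.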

3,564 citations


Cites methods from "Decimation of triangle meshes"

  • ...Our simplification algorithm is based on the iterative contraction of vertex pairs; a generalization of the iterative edge contraction technique used in previous work....

    [...]

Proceedings ArticleDOI
Hugues Hoppe1
01 Aug 1996
TL;DR: The progressive mesh (PM) representation is introduced, a new scheme for storing and transmitting arbitrary triangle meshes that addresses several practical problems in graphics: smooth geomorphing of level-of-detail approximations, progressive transmission, mesh compression, and selective refinement.
Abstract: Highly detailed geometric models are rapidly becoming commonplace in computer graphics. These models, often represented as complex triangle meshes, challenge rendering performance, transmission bandwidth, and storage capacities. This paper introduces the progressive mesh (PM) representation, a new scheme for storing and transmitting arbitrary triangle meshes. This efficient, lossless, continuous-resolution representation addresses several practical problems in graphics: smooth geomorphing of level-of-detail approximations, progressive transmission, mesh compression, and selective refinement. In addition, we present a new mesh simplification procedure for constructing a PM representation from an arbitrary mesh. The goal of this optimization procedure is to preserve not just the geometry of the original mesh, but more importantly its overall appearance as defined by its discrete and scalar appearance attributes such as material identifiers, color values, normals, and texture coordinates. We demonstrate construction of the PM representation and its applications using several practical models

3,206 citations

Proceedings ArticleDOI
01 Jul 1992
TL;DR: A general method for automatic reconstruction of accurate, concise, piecewise smooth surfaces from unorganized 3D points that is able to automatically infer the topological type of the surface, its geometry, and the presence and location of features such as boundaries, creases, and corners.
Abstract: This thesis describes a general method for automatic reconstruction of accurate, concise, piecewise smooth surfaces from unorganized 3D points. Instances of surface reconstruction arise in numerous scientific and engineering applications, including reverse-engineering--the automatic generation of CAD models from physical objects. Previous surface reconstruction methods have typically required additional knowledge, such as structure in the data, known surface genus, or orientation information. In contrast, the method outlined in this thesis requires only the 3D coordinates of the data points. From the data, the method is able to automatically infer the topological type of the surface, its geometry, and the presence and location of features such as boundaries, creases, and corners. The reconstruction method has three major phases: (1) initial surface estimation, (2) mesh optimization, and (3) piecewise smooth surface optimization. A key ingredient in phase 3, and another principal contribution of this thesis, is the introduction of a new class of piecewise smooth representations based on subdivision. The effectiveness of the three-phase reconstruction method is demonstrated on a number of examples using both simulated and real data. Phases 2 and 3 of the surface reconstruction method can also be used to approximate existing surface models. By casting surface approximation as a global optimization problem with an energy function that directly measures deviation of the approximation from the original surface, models are obtained that exhibit excellent accuracy to conciseness trade-offs. Examples of piecewise linear and piecewise smooth approximations are generated for various surfaces, including meshes, NURBS surfaces, CSG models, and implicit surfaces.

3,119 citations


Cites background from "Decimation of triangle meshes"

  • ...[60] is to simplify meshes generated by “marching cubes” that may consist of millions of triangles....

    [...]

  • ...[60], Turk [71], Rossignac and Borrel [53], and Lounsbery et al....

    [...]

Proceedings ArticleDOI
Gabriel Taubin1
15 Sep 1995
TL;DR: A very simple surface signal low-pass filter algorithm that applies to surfaces of arbitrary topology that is a linear time and space complexity algorithm and a very effective fair surface design technique.
Abstract: In this paper we describe a new tool for interactive free-form fair surface design. By generalizing classical discrete Fourier analysis to two-dimensional discrete surface signals – functions defined on polyhedral surfaces of arbitrary topology –, we reduce the problem of surface smoothing, or fairing, to low-pass filtering. We describe a very simple surface signal low-pass filter algorithm that applies to surfaces of arbitrary topology. As opposed to other existing optimization-based fairing methods, which are computationally more expensive, this is a linear time and space complexity algorithm. With this algorithm, fairing very large surfaces, such as those obtained from volumetric medical data, becomes affordable. By combining this algorithm with surface subdivision methods we obtain a very effective fair surface design technique. We then extend the analysis, and modify the algorithm accordingly, to accommodate different types of constraints. Some constraints can be imposed without any modification of the algorithm, while others require the solution of a small associated linear system of equations. In particular, vertex location constraints, vertex normal constraints, and surface normal discontinuities across curves embedded in the surface, can be imposed with this technique. CR

2,004 citations


Cites methods from "Decimation of triangle meshes"

  • ...Unless these large surfaces are first simplified [29, 13, 11], or re-meshed using far fewer faces [35], methods based on patch technology, whether parametric [28, 22, 10, 20, 19] or implicit [1, 23], are not acceptable either....

    [...]

Journal ArticleDOI
TL;DR: Metro allows one to compare the difference between a pair of surfaces by adopting a surface sampling approach, and returns both numerical results and visual results, by coloring the input surface according to the approximation error.
Abstract: This paper presents a new tool, Metro, designed to compensate for a deficiency in many simplification methods proposed in the literature. Metro allows one to compare the difference between a pair of surfaces (e.g. a triangulated mesh and its simplified representation) by adopting a surface sampling approach. It has been designed as a highly general tool, and it makes no assumption about the particular approach used to build the simplified representation. It returns both numerical results (mesh areas and volumes, maximum and mean error, etc.) and visual results, by coloring the input surface according to the approximation error.

1,585 citations

References
Proceedings ArticleDOI
01 Aug 1987
TL;DR: In this paper, a divide-and-conquer approach is used to generate inter-slice connectivity, and then a case table is created to define triangle topology using linear interpolation.
Abstract: We present a new algorithm, called marching cubes, that creates triangle models of constant density surfaces from 3D medical data. Using a divide-and-conquer approach to generate inter-slice connectivity, we create a case table that defines triangle topology. The algorithm processes the 3D medical data in scan-line order and calculates triangle vertices using linear interpolation. We find the gradient of the original data, normalize it, and use it as a basis for shading the models. The detail in images produced from the generated surface models is the result of maintaining the inter-slice connectivity, surface data, and gradient information present in the original 3D data. Results from computed tomography (CT), magnetic resonance (MR), and single-photon emission computed tomography (SPECT) illustrate the quality and functionality of marching cubes. We also discuss improvements that decrease processing time and add solid modeling capabilities.

13,231 citations

01 Jan 1985
TL;DR: This book offers a coherent treatment, at the graduate textbook level, of the field that has come to be known in the last decade or so as computational geometry.
Abstract: From the reviews: "This book offers a coherent treatment, at the graduate textbook level, of the field that has come to be known in the last decade or so as computational geometry...The book is well organized and lucidly written; a timely contribution by two founders of the field. It clearly demonstrates that computational geometry in the plane is now a fairly well-understood branch of computer science and mathematics. It also points the way to the solution of the more challenging problems in dimensions higher than two."

6,525 citations

Book
02 Jan 1991

1,377 citations

Proceedings ArticleDOI
01 Jul 1992
TL;DR: This paper shows how a new set of vertices can be distributed over the surface of a model and connected to one another to create a re-tiling of a surface that is faithful to both the geometry and the topology of the original surface.
Abstract: This paper presents an automatic method of creating surface models at several levels of detail from an original polygonal description of a given object. Representing models at various levels of detail is important for achieving high frame rates in interactive graphics applications and also for speeding-up the off-line rendering of complex scenes. Unfortunately, generating these levels of detail is a time-consuming task usually left to a human modeler. This paper shows how a new set of vertices can be distributed over the surface of a model and connected to one another to create a re-tiling of a surface that is faithful to both the geometry and the topology of the original surface. The main contributions of this paper are: 1) a robust method of connecting together new vertices over a surface, 2) a way of using an estimate of surface curvature to distribute more new vertices at regions of higher curvature and 3) a method of smoothly interpolating between models that represent the same object at different levels of detail. The key notion in the re-tiling procedure is the creation of an intermediate model called the mutual tessellation of a surface that contains both the vertices from the original model and the new points that are to become vertices in the re-tiled surface. The new model is then created by removing each original vertex and locally re-triangulating the surface in a way that matches the local connectedness of the initial surface. This technique for surface retessellation has been successfully applied to iso-surface models derived from volume data, Connolly surface molecular models and a tessellation of a minimal surface of interest to mathematicians.

923 citations


"Decimation of triangle meshes" refers background in this paper

  • ...The three steps of the algorithm are: 1. characterize the local vertex geometry and topology, 2. evaluate the decimation criteria, and 3. triangulate the resulting hole....

    [...]

Journal ArticleDOI
Jules Bloomenthal1
TL;DR: A numerical technique that approximates an implicit surface with a polygonal representation, so that the roots to the function need not be solved each time the surface is rendered.

532 citations

Frequently Asked Questions (12)
Q1. What contributions have the authors mentioned in the paper "Decimation of triangle meshes" ?

This paper describes an application independent algorithm that uses local operations on geometry and topology to reduce the number of triangles in a triangle mesh. After describing other work related to model creation from sampled data, the authors describe the triangle decimation process and its implementation. 

A loop of three vertices forms a triangle that may be added to the mesh, and terminates the recursion process. 

Usually the termination criterion is specified as a percent reduction of the original mesh (or equivalent), or as some maximum decimation value. 

A simple vertex is surrounded by a complete cycle of triangles, and each edge that uses the vertex is used by exactly two triangles. 

The first application applies the decimation algorithm to isosurfaces created from medical and industrial computed tomography scanners. 

That is, if every point in a candidate loop is on one side of the split plane, then the two loops do not overlap and the split plane is acceptable. 

If the edge is not used by two triangles, or if the vertex is used by a triangle not in the cycle of triangles, then the vertex is complex. 

The Mars example is an appropriate test because the authors had access to sub-sampled resolution data that could be compared with the decimated models. 

In both examples the authors generated an initial mesh by creating two triangles for each uniform quadrilateral element in the sampled data. 

From the Euler relation it follows that removal of a simple, corner, or interior edge vertex reduces the mesh by precisely two triangles. 

The three steps of the algorithm are: 1. characterize the local vertex geometry and topology, 2. evaluate the decimation criteria, and 3. triangulate the resulting hole. 

During a pass, each vertex is a candidate for removal and, if it meets the specified decimation criteria, the vertex and all triangles that use the vertex are deleted.