Journal
Graphics Gems
About: Graphics Gems is an academic journal. The journal publishes mainly in the areas of Intersection and Line–plane intersection. Over its lifetime, 142 publications have been published, receiving 6,453 citations.
Papers
2,671 citations
TL;DR: This gem presents software that performs spatial partitioning and polygonal approximation of implicit surfaces; the software, which includes a simple test application, is intended to encourage experimentation with implicit surface modeling.
Abstract: An algorithm for the polygonization of implicit surfaces is described and an implementation in C is provided. The discussion reviews implicit surface polygonization and compares various methods.

Introduction: Some shapes are more readily defined by implicit, rather than parametric, techniques. For example, consider a sphere centered at C with radius r. It can be described parametrically as a set of points {P}; the implicit definition for the same sphere is more compact: (P_x − C_x)^2 + (P_y − C_y)^2 + (P_z − C_z)^2 − r^2 = 0. Because an implicit representation does not produce points by substitution, root-finding must be employed to render its surface. One such method is ray tracing, which can generate excellent images of implicit surfaces. Alternatively, an image of the function (not the surface) can be created with volume rendering. Polygonization is a method whereby a polygonal (i.e., parametric) approximation to the implicit surface is created from the implicit surface function. This allows the surface to be rendered with conventional polygon renderers; it also permits non-imaging operations, such as positioning an object on the surface.

Polygonization consists of two principal steps. First, space is partitioned into adjacent cells at whose corners the implicit surface function is evaluated; negative values are considered inside the surface, positive values outside. Second, within each cell, the intersections of cell edges with the implicit surface are connected to form one or more polygons.

In this gem we present software that performs spatial partitioning and polygonal approximation. We hope this software, which includes a simple test application, will encourage experimentation with implicit surface modeling. The implementation is relatively simple (about 400 lines, ignoring comments, the test application, and the cubical polygonization option). Some of this simplicity derives from the use of the cube as the partitioning cell; its symmetries provide a simple means to compute and index corner locations. We do not employ automatic techniques (such as interval analysis) to set polygonization parameters (such as cell size); these are set by the user, who must also judge the success of the polygonization. This not only simplifies the implementation, but also permits the implicit surface function to be treated as a 'black box.' The function, for example, can be procedural or even discontinuous (although discontinuous functions may produce undesirable results). The use of a fixed resolution (i.e., an unchanging cell size) also simplifies the implementation, which explains the popularity of fixed-resolution over adaptive-resolution methods. This makes …
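The two steps described above can be illustrated with a short, self-contained sketch. The following C fragment is only an illustration of the corner-evaluation and edge-intersection idea, not the gem's actual 400-line implementation; the sphere function, the edge_intersect helper, and the single hand-picked cell edge are assumptions made for the example.

/* Minimal sketch of the corner-evaluation and edge-intersection steps
 * described above. Illustrative only, not the gem's code; the names
 * (sphere, edge_intersect) and the fixed cell edge are assumptions. */
#include <stdio.h>

typedef struct { double x, y, z; } Point;

/* Implicit sphere centered at c with radius r: negative inside, positive outside. */
static double sphere(Point p, Point c, double r)
{
    double dx = p.x - c.x, dy = p.y - c.y, dz = p.z - c.z;
    return dx * dx + dy * dy + dz * dz - r * r;
}

/* Linear interpolation of the zero crossing between two cell corners
 * whose function values have opposite signs. */
static Point edge_intersect(Point a, double fa, Point b, double fb)
{
    double t = fa / (fa - fb);
    Point p = { a.x + t * (b.x - a.x),
                a.y + t * (b.y - a.y),
                a.z + t * (b.z - a.z) };
    return p;
}

int main(void)
{
    Point c = { 0.0, 0.0, 0.0 };
    double r = 1.0;

    /* One edge of a cubical cell straddling the surface:
     * (0.9, 0, 0) is inside the unit sphere, (1.1, 0, 0) is outside. */
    Point a = { 0.9, 0.0, 0.0 };
    Point b = { 1.1, 0.0, 0.0 };
    double fa = sphere(a, c, r);   /* < 0: inside  */
    double fb = sphere(b, c, r);   /* > 0: outside */

    if (fa * fb < 0.0) {
        Point hit = edge_intersect(a, fa, b, fb);
        printf("surface crosses edge at (%.3f, %.3f, %.3f)\n",
               hit.x, hit.y, hit.z);
    }
    return 0;
}

Running this prints the point where the unit sphere crosses the edge between the inside corner and the outside corner, found by linearly interpolating the two signed function values; a full polygonizer would do this for every crossed edge of every cell and connect the resulting points into polygons.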
358 citations
TL;DR: A new method for filling a color table is presented that produces pictures of similar quality to existing methods but requires less memory and execution time.
Abstract: A new method for filling a color table is presented that produces pictures of similar quality to existing methods but requires less memory and execution time. All colors of an image are inserted into an octree, and this octree is reduced from the leaves to the root in such a way that every pixel has a well-defined maximum error. The algorithm is described in PASCAL notation.
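The key step described above, inserting every image color into an octree, works by letting one bit from each of the red, green, and blue channels select one of eight children at each level. The following C sketch shows only this insertion and indexing step under assumed names (Node, child_index, insert_color, MAX_DEPTH); the leaf-to-root reduction that bounds the per-pixel error is omitted, and the original gem presents its algorithm in PASCAL notation rather than C.

/* Minimal sketch of octree color insertion: at depth d, one bit from each
 * of R, G and B selects one of 8 children. Illustrative only; node layout
 * and MAX_DEPTH are assumptions, and the reduction step is omitted. */
#include <stdlib.h>

#define MAX_DEPTH 8   /* 8 bits per channel */

typedef struct Node {
    struct Node  *child[8];
    unsigned long pixel_count;          /* pixels reaching this node      */
    unsigned long r_sum, g_sum, b_sum;  /* for averaging during reduction */
} Node;

static int child_index(unsigned char r, unsigned char g,
                       unsigned char b, int depth)
{
    int shift = 7 - depth;
    return (((r >> shift) & 1) << 2) |
           (((g >> shift) & 1) << 1) |
           (((b >> shift) & 1));
}

static void insert_color(Node *root, unsigned char r,
                         unsigned char g, unsigned char b)
{
    Node *node = root;
    for (int depth = 0; depth < MAX_DEPTH; depth++) {
        int i = child_index(r, g, b, depth);
        if (node->child[i] == NULL)
            node->child[i] = calloc(1, sizeof(Node));
        node = node->child[i];
    }
    node->pixel_count++;          /* leaf accumulates this color */
    node->r_sum += r;
    node->g_sum += g;
    node->b_sum += b;
}

After all pixels are inserted, a reduction pass in the spirit of the abstract would repeatedly merge a node's children into their parent (summing pixel counts and channel sums) until the number of leaves fits the color table, with each surviving leaf's average color becoming one table entry.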
347 citations
TL;DR: Signals are never perfectly bandlimited, nor can a perfect reconstruction filter be constructed, but one can get as close as desired in a prescribed manner.
Abstract: Even though a signal is sampled, we may have certain rules about inferring the values between the sample points. The most common assumption made in signal processing is that the signal is bandlimited to an extent consistent with the sampling rate, i.e., that the values change smoothly between samples. The Sampling Theorem guarantees that a continuous signal can be reconstructed perfectly from its samples if the signal was appropriately bandlimited prior to sampling [Oppenheim 75]. Practically speaking, signals are never perfectly bandlimited, nor can we construct a perfect reconstruction filter, but we can get as close as we want in a prescribed manner.
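As a concrete illustration of "getting as close as we want" to the ideal reconstruction filter, the following C sketch evaluates a signal between its samples with a truncated sinc kernel. It is only a sketch of the general idea under assumed names (sinc, reconstruct) and an assumed kernel half-width RADIUS; the gem discusses the underlying theory rather than prescribing this particular code.

/* Minimal sketch of approximate reconstruction between samples with a
 * truncated sinc kernel: the ideal filter is infinitely wide, so it is
 * cut off at RADIUS samples and the resulting error accepted.
 * Names and RADIUS are assumptions for the example. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define RADIUS 4   /* half-width of the truncated kernel, in samples */

static double sinc(double x)
{
    if (x == 0.0)
        return 1.0;
    return sin(M_PI * x) / (M_PI * x);
}

/* Reconstruct the signal at continuous position t (in sample units)
 * from the discrete samples s[0..n-1]. */
static double reconstruct(const double *s, int n, double t)
{
    int    center = (int)floor(t);
    double value  = 0.0;

    for (int k = center - RADIUS + 1; k <= center + RADIUS; k++) {
        if (k < 0 || k >= n)
            continue;                  /* treat out-of-range samples as 0 */
        value += s[k] * sinc(t - k);
    }
    return value;
}

int main(void)
{
    /* Samples of a slowly varying signal; evaluate halfway between samples. */
    double s[] = { 0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5, 0.0 };
    printf("value at t = 3.5: %f\n", reconstruct(s, 9, 3.5));
    return 0;
}

Widening RADIUS (or applying a window to the truncated kernel) reduces the approximation error, which is the practical sense in which one can get as close to perfect reconstruction as desired. Compile with the math library (e.g., cc example.c -lm).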