
Showing papers in "ACM Transactions on Graphics in 2000"


Journal ArticleDOI
TL;DR: This work introduces the notion of image-driven simplification, a framework that uses images to decide which portions of a model to simplify, a departure from approaches that make polygonal simplification decisions based on geometry.
Abstract: We introduce the notion of image-driven simplification, a framework that uses images to decide which portions of a model to simplify. This is a departure from approaches that make polygonal simplification decisions based on geometry. As with many methods, we use the edge collapse operator to make incremental changes to a model. Unique to our approach, however, is the use of comparisons between images of the original model and those of a simplified model to determine the cost of an edge collapse. We use common graphics rendering hardware to accelerate the creation of the required images. As expected, this method produces models that are close to the original model according to image differences. Perhaps more surprising, however, is that the method yields models that have high geometric fidelity as well. Our approach also solves the quandary of how to weight the geometric distance versus appearance properties such as normals, color, and texture. All of these trade-offs are balanced by the image metric. Benefits of this approach include high fidelity silhouettes, extreme simplification of hidden portions of a model, attention to shading interpolation effects, and simplification that is sensitive to the content of a texture. In order to better preserve the appearance of textured models, we introduce a novel technique for assigning texture coordinates to the new vertices of the mesh. This method is based on a geometric heuristic that can be integrated with any edge collapse algorithm to produce high quality textured surfaces.
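
To make the underlying idea concrete, here is a minimal Python sketch of an image-driven edge-collapse cost under stated assumptions: `render`, the candidate-collapse objects, and their `apply` method are hypothetical placeholders, and the paper uses hardware-accelerated rendering from multiple viewpoints rather than this toy array comparison.

import numpy as np

def image_space_cost(render, model, simplified_model, viewpoints):
    """Sum of per-view RMS pixel differences between renderings of the
    original and simplified models. `render(model, view)` is assumed to
    return an HxWx3 float image."""
    cost = 0.0
    for view in viewpoints:
        before = render(model, view)
        after = render(simplified_model, view)
        cost += np.sqrt(np.mean((before - after) ** 2))
    return cost

def cheapest_collapse(render, model, candidate_collapses, viewpoints):
    # Choose the edge collapse whose renderings deviate least from the original.
    return min(candidate_collapses,
               key=lambda c: image_space_cost(render, model, c.apply(model), viewpoints))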

265 citations


Journal ArticleDOI
TL;DR: A hybrid of the shadow map and shadow volume approaches that avoids these difficulties and leverages high-performance polygon rendering; its template-based reconstruction scheme simplifies capping the shadow volume after the near-plane clip.
Abstract: Current graphics hardware can be used to generate shadows using either the shadow volume or shadow map techniques. However, the shadow volume technique requires access to a representation of the scene as a polygonal model, and handling the near-plane clip correctly and efficiently is difficult; conversely, accurate shadow maps require high-precision texture map data representations, but these are not widely supported. We present a hybrid of the shadow map and shadow volume approaches which does not have these difficulties and leverages high-performance polygon rendering. The scene is rendered from the point of view of the light source and a sampled depth map is recovered. Edge detection and a template-based reconstruction technique are used to generate a global shadow volume boundary surface, after which the pixels in shadow can be marked using only a one-bit stencil buffer and a single-pass rendering of the shadow volume boundary polygons. The simple form of our template-based reconstruction scheme simplifies capping the shadow volume after the near-plane clip.
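
As a rough illustration of the edge-detection step, the Python sketch below flags depth discontinuities in the light-view depth map; these are the places from which silhouette edges, and hence shadow-volume boundary polygons, would be reconstructed. The function name and threshold are assumptions, not values from the paper.

import numpy as np

def depth_discontinuities(depth, threshold=0.05):
    """Mark pixels of an HxW light-view depth map (values in [0, 1]) whose
    depth differs sharply from a horizontal or vertical neighbour."""
    dx = np.abs(np.diff(depth, axis=1))   # horizontal differences, shape (H, W-1)
    dy = np.abs(np.diff(depth, axis=0))   # vertical differences, shape (H-1, W)
    edges = np.zeros_like(depth, dtype=bool)
    edges[:, :-1] |= dx > threshold
    edges[:-1, :] |= dy > threshold
    return edges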

116 citations


Journal ArticleDOI
TL;DR: Techniques are presented for navigating between adjacent triangles of greater or equal size in a hierarchical triangle mesh where the triangles are obtained by a recursive quadtree-like subdivision of the underlying space into four equilateral triangles, suitable for finite element analysis, ray tracing, and the modeling of spherical data.
Abstract: Techniques are presented for navigating between adjacent triangles of greater or equal size in a hierarchical triangle mesh where the triangles are obtained by a recursive quadtree-like subdivision of the underlying space into four equilateral triangles. These techniques are useful in a number of applications, including finite element analysis, ray tracing, and the modeling of spherical data. The operations are implemented in a manner analogous to that used in a quadtree representation of data on the two-dimensional plane, where the underlying space is tessellated into a square mesh. A new technique is described for labeling the triangles, which is useful in implementing the quadtree triangle mesh as a linear quadtree (i.e., a pointerless quadtree); the navigation can then take place in this linear quadtree. When the neighbors are of equal size, the algorithms have a worst-case constant time complexity. The algorithms are very efficient, as they make use of just a few bit manipulation operations, and can be implemented in hardware using just a few machine language instructions. The use of these techniques when modeling spherical data by projecting it onto the faces of a regular solid whose faces are equilateral triangles, which are represented as quadtree triangle meshes, is discussed in detail. The methods are applicable to the icosahedron, octahedron, and tetrahedron; the difference lies in the way transitions are made between the faces of the polyhedron. However, regardless of the type of polyhedron, the computational complexity of the methods is the same.
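
For intuition only, the Python sketch below performs the quadtree-like four-way split of a triangle and assigns each leaf a base-4 path label of the kind a pointerless (linear) quadtree addresses by; it does not reproduce the paper's specific labeling scheme or its bit-manipulation neighbor-finding rules.

def subdivide(tri):
    """Split a triangle (three vertex tuples) into four congruent children:
    three corner triangles plus the inverted centre triangle."""
    a, b, c = tri
    mid = lambda p, q: tuple((pi + qi) / 2.0 for pi, qi in zip(p, q))
    ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
    return [(a, ab, ca),    # child 0: corner at a
            (ab, b, bc),    # child 1: corner at b
            (ca, bc, c),    # child 2: corner at c
            (ab, bc, ca)]   # child 3: centre, inverted

def leaf_labels(tri, depth, prefix=""):
    # Enumerate leaf triangles together with a base-4 path label such as "130".
    if depth == 0:
        return [(prefix, tri)]
    out = []
    for i, child in enumerate(subdivide(tri)):
        out += leaf_labels(child, depth - 1, prefix + str(i))
    return out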

102 citations


Journal ArticleDOI
TL;DR: The proposed illumination technique is novel, providing intermediate image solutions of high quality at unprecedented speeds, even for complex scenes, and makes possible the development of a more robust adaptive mesh subdivision, which is guided by local contrast information.
Abstract: A novel view-independent technique for progressive global illumination computing that uses prediction of visible differences to improve both efficiency and effectiveness of physically-sound lighting solutions has been developed. The technique is a mixture of stochastic (density estimation) and deterministic (adaptive mesh refinement) algorithms used in a sequence and optimized to reduce the differences between the intermediate and final images as perceived by the human observer in the course of lighting computation. The quantitative measurements of visibility were obtained using the model of human vision captured in the visible differences predictor (VDP) developed by Daly [1993]. The VDP responses were used to support the selection of the best component algorithms from a pool of global illumination solutions, and to enhance the selected algorithms for even better progressive refinement of image quality. The VDP was also used to determine the optimal sequential order of component-algorithm execution, and to choose the points at which switchover between algorithms should take place. As the VDP is computationally expensive, it was applied exclusively at the design and tuning stage of the composite technique, and so perceptual considerations are embedded into the resulting solution, though no VDP calculations were performed during lighting simulation. The proposed illumination technique is also novel, providing intermediate image solutions of high quality at unprecedented speeds, even for complex scenes. One advantage of the technique is that local estimates of global illumination are readily available at the early stages of computing, making possible the development of a more robust adaptive mesh subdivision, which is guided by local contrast information. Efficient object space filtering, also based on stochastically-derived estimates of the local illumination error, is applied to substantially reduce the visible noise inherent in stochastic solutions.
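
The VDP itself is far more elaborate than anything reproduced here; purely as a sketch of the switchover logic, the Python below substitutes a crude per-pixel relative-luminance difference for the VDP and hands over to the next algorithm once successive intermediate images no longer differ perceptibly. All names and thresholds are hypothetical.

import numpy as np

def perceived_difference(img_a, img_b):
    # Crude stand-in for the VDP: fraction of pixels whose relative luminance
    # difference exceeds a small contrast threshold.
    luma = lambda img: img @ np.array([0.2126, 0.7152, 0.0722])
    la, lb = luma(img_a), luma(img_b)
    rel = np.abs(la - lb) / (np.maximum(la, lb) + 1e-4)
    return float(np.mean(rel > 0.02))

def should_switch(previous_frame, current_frame, stall_threshold=0.01):
    # Switch to the next component algorithm once successive intermediate
    # images stop improving perceptibly (hypothetical criterion).
    return perceived_difference(previous_frame, current_frame) < stall_threshold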

74 citations


Journal ArticleDOI
TL;DR: In this paper, an algorithm for partitioning the points into subsets and fitting a parametric curve to each subset is described, where the points could be measurements from a physical phenomenon, and the objective in this process could be to find patterns among the points and describe the phenomenon analytically.
Abstract: Given a large set of irregularly spaced points in the plane, an algorithm for partitioning the points into subsets and fitting a parametric curve to each subset is described. The points could be measurements from a physical phenomenon, and the objective in this process could be to find patterns among the points and describe the phenomenon analytically. The points could also be measurements from a geometric model, and the objective could be to reconstruct the model by a combination of parametric curves. The algorithm proposed here can be used in various applications, especially where the given points are dense and noisy. Examples demonstrating the behavior of the algorithm under varying noise and point density are presented and discussed.
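
A minimal Python sketch of the per-subset fitting step, assuming the points have already been partitioned (the paper's own partitioning strategy and curve model are not reproduced): each subset gets a least-squares parametric polynomial, parametrized by chord length.

import numpy as np

def fit_parametric_curve(points, degree=3):
    """Least-squares fit of a parametric polynomial curve (x(t), y(t)) to one
    subset of planar points, using chord-length parametrization."""
    pts = np.asarray(points, dtype=float)
    chord = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(pts, axis=0), axis=1))]
    t = chord / chord[-1] if chord[-1] > 0 else np.linspace(0.0, 1.0, len(pts))
    coeffs = np.polyfit(t, pts, degree)   # one column of coefficients per coordinate
    return np.poly1d(coeffs[:, 0]), np.poly1d(coeffs[:, 1])

Evaluating the two returned polynomials on a dense grid of t values in [0, 1] reconstructs the fitted curve for that subset.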

62 citations


Journal ArticleDOI
TL;DR: The perturbation formula presented here holds for general multiple-bounce reflection paths and provides a mathematical foundation for exploiting path coherence in ray tracing acceleration techniques and incremental rendering.
Abstract: In this paper we apply perturbation methods to the problem of computing specular reflections in curved surfaces. The key idea is to generate families of closely related optical paths by expanding a given path into a high-dimensional Taylor series. Our path perturbation method is based on closed-form expressions for linear and higher-order approximations of ray paths, which are derived using Fermat's Variation Principle and the Implicit Function Theorem (IFT). The perturbation formula presented here holds for general multiple-bounce reflection paths and provides a mathematical foundation for exploiting path coherence in ray tracing acceleration techniques and incremental rendering. To illustrate its use, we describe an algorithm for fast approximation of specular reflections on curved surfaces; the resulting images are highly accurate and nearly indistinguishable from ray traced images.
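
In outline (a generic statement of the implicit-function-theorem step, not the paper's exact formulas): Fermat's principle makes the optical path length L stationary in the reflection points x, which gives an implicit constraint whose differentiation yields the first-order path perturbation,

\[
F(\mathbf{x},\mathbf{p}) \;=\; \nabla_{\mathbf{x}} L(\mathbf{x},\mathbf{p}) \;=\; 0,
\qquad
\frac{d\mathbf{x}}{d\mathbf{p}} \;=\; -\left(\frac{\partial F}{\partial \mathbf{x}}\right)^{-1}\frac{\partial F}{\partial \mathbf{p}},
\qquad
\mathbf{x}(\mathbf{p}+\Delta\mathbf{p}) \;\approx\; \mathbf{x}(\mathbf{p}) + \frac{d\mathbf{x}}{d\mathbf{p}}\,\Delta\mathbf{p},
\]

where p collects the path endpoints (e.g., eye and object points) and the approximation can be carried to higher order by continuing the Taylor expansion.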

58 citations


Journal Article
TL;DR: A refined version of the texture potential mapping algorithm is introduced in which a one-dimensional MIP map is incorporated, which has the effect of controlling the maximum number of texture samples required.
Abstract: A refined version of the texture potential mapping algorithm is introduced in which a one-dimensional MIP map is incorporated. This has the effect of controlling the maximum number of texture samples required. The new technique is compared to existing texture antialiasing methods in terms of quality and sample count. The new method is shown to compare favorably with existing techniques for producing high quality antialiased, texture-mapped images.
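
For context, a one-dimensional MIP map is simply a pyramid of successively box-filtered copies of a 1D signal; the Python sketch below builds such a pyramid and performs a nearest-level lookup. It is a generic illustration under stated assumptions (power-of-two length, hypothetical names), not the texture potential mapping algorithm itself.

import numpy as np

def build_mip_1d(signal):
    """Build a 1D MIP pyramid by repeated pairwise averaging; the signal
    length is assumed to be a power of two."""
    levels = [np.asarray(signal, dtype=float)]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        levels.append(0.5 * (prev[0::2] + prev[1::2]))
    return levels

def sample_mip_1d(levels, u, footprint):
    # Pick the level whose texel spacing roughly matches the filter footprint
    # (both expressed in [0, 1] units), then do a nearest lookup.
    n = len(levels[0])
    level = int(np.clip(np.round(np.log2(max(footprint, 1e-9) * n)), 0, len(levels) - 1))
    texels = levels[level]
    return texels[min(int(u * len(texels)), len(texels) - 1)]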

14 citations


Journal ArticleDOI
Ramon F. Sarraga1
TL;DR: An alternative G-smooth surface construction, inspired by the theory of manifolds, that is subject to fewer application constraints than approaches found in the technical literature is explored, and imposes no artificial restrictions on the tangents of patch boundary curves at vertex points.
Abstract: This article presents a method for constructing a G1-smooth surface, composed of independently parametrized triangular polynomial Bezier patches, to fit scattered data points triangulated in R3 with arbitrary topology. The method includes a variational technique to optimize the shape of the surface. A systematic development of the method is given, presenting general equations provided by the theory of manifolds, explaining the heuristic assumptions made to simplify calculations, and analyzing the numerical results obtained from fitting two test configurations of scattered data points. The goal of this work is to explore an alternative G1 construction, inspired by the theory of manifolds, that is subject to fewer application constraints than approaches found in the technical literature; e.g., this approach imposes no artificial restrictions on the tangents of patch boundary curves at vertex points of a G1 surface. The constructed surface shapes fit all test data surprisingly well for a noniterative method based on polynomial patches.
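
The building block being stitched together is the triangular Bezier patch; for reference, the Python sketch below evaluates a single such patch at barycentric coordinates by the standard de Casteljau recursion. The G1 stitching and the variational shape optimization of the article are not reproduced here.

import numpy as np

def decasteljau_triangle(control, u, v, w):
    """Evaluate a triangular Bezier patch at barycentric coordinates (u, v, w),
    with u + v + w = 1. `control` maps multi-indices (i, j, k), i + j + k = n,
    to 3D control points."""
    n = max(i + j + k for (i, j, k) in control)
    pts = {ijk: np.asarray(p, dtype=float) for ijk, p in control.items()}
    for r in range(n):
        m = n - r - 1
        pts = {(i, j, k): u * pts[(i + 1, j, k)]
                        + v * pts[(i, j + 1, k)]
                        + w * pts[(i, j, k + 1)]
               for i in range(m + 1) for j in range(m + 1 - i)
               for k in [m - i - j]}
    return pts[(0, 0, 0)]

For a cubic patch, `control` would hold the ten points b_ijk with i + j + k = 3.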

14 citations


Journal ArticleDOI
TL;DR: A unified methodology to tackle geometry processing operations admitting explicit algebraic expressions, based on representing and manipulating polynomials algebraically in a recently introduced basis, the symmetric analogue of the power form, so called because it is associated with a “Hermite two-point expansion” instead of a Taylor expansion.
Abstract: We propose a unified methodology to tackle geometry processing operations admitting explicit algebraic expressions. This new approach is based on representing and manipulating polynomials algebraically in a recently introduced basis, the symmetric analogue of the power form (s-power basis for brevity), so called because it is associated with a “Hermite two-point expansion” instead of a Taylor expansion. Given the expression of a polynomial in this basis over the unit interval u ∈ [0, 1], degree reduction is trivially obtained by truncation, which yields the corresponding Hermite interpolant; conversely, we can keep as many terms as desired and build “s-power series,” akin to Taylor series. Applications include computing integral approximations of rational polynomials, or approximations of offset curves.
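
The s-power basis itself is not reproduced here; as a reminder of what such a truncation yields, the degree-3 two-point Hermite interpolant of a function p on [0, 1] (matching value and first derivative at both ends) is

\[
H_3(u) = p(0)\,(2u^3 - 3u^2 + 1) + p'(0)\,(u^3 - 2u^2 + u) + p(1)\,(-2u^3 + 3u^2) + p'(1)\,(u^3 - u^2),
\qquad u \in [0, 1],
\]

and keeping further terms of a two-point expansion raises the contact order at both endpoints, much as adding terms to a Taylor series raises it at a single point.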

12 citations


Journal ArticleDOI
TL;DR: This paper presents a method for determining a priori a constant parameter interval for tessellating a rational curve or surface such that the deviation of the curve or surface from its piecewise linear approximation is within a specified tolerance.
Abstract: This paper presents a method for determining a priori a constant parameter interval for tessellating a rational curve or surface such that the deviation of the curve or surface from its piecewise linear approximation is within a specified tolerance. The parameter interval is estimated based on information about second-order derivatives in the homogeneous coordinates, instead of using affine coordinates directly. This new step size can be found with roughly the same amount of computation as the step size in Cheng [1992], though it can be proven to always be larger than Cheng's step size. In fact, numerical experiments show the new step is typically orders of magnitude larger than the step size in Cheng [1992]. Furthermore, for rational cubic and quartic curves, the new step size is generally twice as large as the step size found by computing bounds on the Bernstein polynomial coefficients of the second derivative functions.
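
For orientation, the standard second-derivative bound behind such a priori step sizes (stated here in generic affine form; the paper's contribution is the sharper homogeneous-coordinate version) is: if ||C''(t)|| <= M on an interval, the deviation of the curve from the chord over a parameter step h satisfies

\[
\max_{t}\,\bigl\|C(t) - \ell(t)\bigr\| \;\le\; \frac{M\,h^{2}}{8},
\qquad\text{so a tolerance } \varepsilon \text{ is guaranteed by } h = \sqrt{\frac{8\,\varepsilon}{M}},
\]

where \ell is the linear interpolant of the two endpoints of the step.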

12 citations


Journal ArticleDOI
TL;DR: A new hierarchical algorithm is proposed in which partial visibility maps can be computed on the fly, using a convolution technique for emitter-receiver configurations where complex shadows are produced.
Abstract: Lighting simulations using hierarchical radiosity with clustering can be very slow when the computation of fine and artifact-free shadows is needed. To avoid the high cost of mesh refinement associated with fast variations of visibility across receivers, we propose a new hierarchical algorithm in which partial visibility maps can be computed on the fly, using a convolution technique for emitter-receiver configurations where complex shadows are produced. Other configurations still rely on mesh subdivision to reach the desired accuracy in modeling energy transfer. In our system, therefore, radiosity is represented as a combination of textures and piecewise-constant or linear contributions over mesh elements at multiple hierarchical levels. We give a detailed description of the gather, push/pull, and display stages of the hierarchical radiosity algorithm, adapted to seamlessly integrate both representations. A new refinement algorithm is proposed, which chooses the most appropriate technique to compute the energy transfer and resulting radiosity distribution for each receiver/transmitter configuration. Comprehensive error control is achieved by subdividing either the source or receiver in a traditional manner, or by using a blocker subdivision scheme that improves the quality of shadow masks without increasing the complexity of the mesh. Results show that high-quality images are obtained in a matter of seconds for scenes with tens of thousands of polygons.
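
The convolution idea can be illustrated generically (this is not the paper's exact emitter/blocker/receiver projection; names and resolutions are assumptions): convolving an opacity image of the blockers with an image of the extended light source yields an approximate visibility map that can be applied as a shadow texture on the receiver.

import numpy as np

def shadow_mask(blocker_img, source_img):
    """Approximate visibility of an area light behind blockers by convolving
    the blocker opacity image with the normalized source image. Returns values
    in [0, 1], where 1 means fully lit."""
    kernel = source_img / source_img.sum()
    ph = blocker_img.shape[0] + kernel.shape[0] - 1
    pw = blocker_img.shape[1] + kernel.shape[1] - 1
    B = np.fft.rfft2(blocker_img, s=(ph, pw))   # zero-padded FFTs for linear convolution
    K = np.fft.rfft2(kernel, s=(ph, pw))
    occlusion = np.fft.irfft2(B * K, s=(ph, pw))
    # Crop back to the blocker-image footprint and convert occlusion to visibility.
    oy, ox = kernel.shape[0] // 2, kernel.shape[1] // 2
    occlusion = occlusion[oy:oy + blocker_img.shape[0], ox:ox + blocker_img.shape[1]]
    return np.clip(1.0 - occlusion, 0.0, 1.0)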

Journal ArticleDOI
TL;DR: An appreciation of Alain Fournier, who wrote wonderful mathematics, algorithms, prose, and poetry; whose theoretical work made us think about the limits of computer graphics and computational geometry; and whose approach to solving problems was at once courageous and rational.
Abstract: Alain Fournier, 1943-2000: An Appreciation.

There are thinkers of great repute and intellect who would suggest that any objective measure of humankind is in fact a mismeasure. While I am not of a mood to argue the general point either way, it certainly applies to Alain Fournier. I write this appreciation of him so that those who do not know him may be inspired to discover him, and so that those who do know him are able to reflect further on his remarkable life. I beg the reader to indulge me in a rather personal reverie, for it is not possible to have known Alain without having a deep personal response to him.

Allow me to recount very briefly Alain's accomplishments. His early training was in chemistry. After emigrating from France to Canada in the 1970's, he co-wrote a textbook on the topic, and he taught pre-college chemistry in Quebec. His career in computer graphics spanned only about 20 years. He received a Ph.D. in computer science at the University of Texas at Dallas, and reported the results of his Ph.D. work on stochastic modelling in a seminal paper in 1980 with Don Fussell and Loren Carpenter. He then went on to an outstanding academic career, first at the University of Toronto and subsequently at the University of British Columbia. From the outset he played on the international stage, especially in Europe and in North America. He contributed to ACM-TOG as an author and as co-Guest Editor of a special issue (1987), and he was an Associate Editor from 1990-1992.

Alain's early contributions to computer graphics on the modelling of natural phenomena were brilliant in themselves, but perhaps more importantly they advocated a methodology that required validation against real visual phenomena. This set the bar at the right level scientifically. His approach, which he once called "impressionistic graphics", both revolutionised the field and drove it forward. Perhaps the best example of this work is his beautiful paper on the depiction of ocean waves with Bill Reeves. His subsequent work on illumination models, light transport, rendering, and sampling and filtering is remarkable for its far-sightedness and depth. His theoretical work in computer graphics and computational geometry made us think about the limits of both fields.

Alain's approach to solving problems was at once courageous and rational. Were he to ask himself (as I'm sure he did) for a response to T.S. Eliot's musing, I'm sure his answer would be his inimitable "Piece of cake! OK kiddos, here's what we'll do." And he would rush forward, leaving us to linger happily in his slipstream. Endlessly resourceful and tirelessly innovative, he would mould ideas of amazing insight into work that also inspired others, often much later, to take up the challenge.

If C.P. Snow were ever in need of a prototypical person to bridge the "Two Cultures" of Science and Art, Alain would be it. He was blessed with an irrepressible enthusiasm to communicate his understanding and his curiosity about the universe, and he did so in whatever language was most appropriate. He wrote wonderful mathematics, algorithms, prose and poetry. His vocabulary in English and in French was gently intimidating, for even in intimidation he was benevolent. It seemed that his intellect was able to synthesise everything he ever learned. He would routinely interject a Latin "bon mot" into the papers we were writing or practise writing Kanji on the napkins on which we were doing research. We rarely did research in an office. How I miss those days.

His art served him as innately and intuitively as did his science. He wrote exquisite poetry that was both challenging and tender. One day I hope that his work will be more available to the general public. I also hope that Alain's accomplishments will in time be formally recognised by the wider computer graphics community.

Alain's wit, his innate "jeu d'esprit", was legend. His fondness for good jokes, especially Groucho Marx gags, allowed some but not all of us to overlook his weakness for Jerry Lewis.

There are few on this earth who have been blessed with a wider array of talents, and fewer still who had more to say and more to contribute than Alain. And yet, Alain had a sensibility that is common to scientists and artists who have done many great things in their lives: apart from a lovely retrospective paper he wrote for Graphics Interface in 1994, he did not look back sympathetically on his work to derive satisfaction from his accomplishments. He was rooted in the present, and he suffered from the belief that he was only as good as his last project. In the end he may well have believed in Eliot's sentiment: "I have heard the mermaids singing, each to each. I do not think that they will sing to me." Oh, but they have always sung to you, Alain. It was just that the melody was lost amid the clamour of disease.

Alain did not separate the personal and professional. His were passions that required no qualifying adjective. He loved those close to him with an abandon and devotion that was disarming and humbling. Anyone who knew him knew that he was extraordinarily close to his wife and his daughter. They were his greatest joys, his most provocative muses. They were the foundation upon which he built his life.

Leonard Cohen, among others, said that you can't let the facts get in the way of the truth. The facts are that Alain Fournier, a great innovator in our field, died of lymphoma in the early hours of 14 August, 2000, and is survived by his daughter Ariel, his wife Adrienne, and a legion of admirers. The truth is that the sun seems to shine less brightly now that he is not among us. The truth is that he has broadened the minds and moved the hearts of many people around the world. To those who have never known Alain, I express particular sympathy, for there are few people we encounter in life with the ability to make us better than we thought we could be. Such, in truth and in fact, is the measure of this man.

It was difficult not to love Alain. With such a beguiling package of brilliance and benevolent eccentricity, it was simply a matter of time and a question of degree. A unique and wonderful person has left us. Requiescat in pace, my friend.

--Eugene Fiume, November 2000. (The quotations from T.S. Eliot are taken from his poem, "The Love Song of J. Alfred Prufrock".)