Author

Peter Schröder

Bio: Peter Schröder is an academic researcher at the California Institute of Technology whose work centers on polygon meshes and subdivision surfaces. He has an h-index of 61 and has co-authored 133 publications receiving 19,201 citations. His previous affiliations include Bell Labs and the University of South Carolina.


Papers
Book ChapterDOI
01 Jan 2003
TL;DR: A unified and consistent set of flexible tools is proposed for approximating important geometric attributes, including normal vectors and curvatures, on arbitrary triangle meshes, using Voronoi-cell averaging and a mixed finite-element/finite-volume method.
Abstract: This paper proposes a unified and consistent set of flexible tools to approximate important geometric attributes, including normal vectors and curvatures on arbitrary triangle meshes. We present a consistent derivation of these first and second order differential properties using averaging Voronoi cells and the mixed Finite-Element/Finite-Volume method, and compare them to existing formulations. Building upon previous work in discrete geometry, these operators are closely related to the continuous case, guaranteeing an appropriate extension from the continuous to the discrete setting: they respect most intrinsic properties of the continuous differential operators. We show that these estimates are optimal in accuracy under mild smoothness conditions, and demonstrate their numerical quality. We also present applications of these operators, such as mesh smoothing, enhancement, and quality checking, and show results of denoising in higher dimensions, such as for tensor images.
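The cotangent formula at the heart of these operators is easy to prototype. Below is a minimal sketch (not the authors' code) that accumulates cotangent-weighted edge vectors into a per-vertex mean-curvature normal; for simplicity it substitutes a one-third barycentric area for the paper's mixed Voronoi area, which the paper shows gives better accuracy.

```python
# Minimal sketch of the mean-curvature normal on a triangle mesh:
# K(x_i) = 1/(2*A_i) * sum_j (cot a_ij + cot b_ij) (x_i - x_j).
# Assumption: barycentric (one-third) vertex areas stand in for the
# paper's mixed Voronoi areas.
import numpy as np

def mean_curvature_normals(V, F):
    """V: (n,3) vertex positions, F: (m,3) triangle vertex indices."""
    n = len(V)
    K = np.zeros((n, 3))   # accumulated cotangent-weighted edge vectors
    area = np.zeros(n)     # per-vertex area estimate
    for tri in F:
        for k in range(3):
            i, j, l = tri[k], tri[(k + 1) % 3], tri[(k + 2) % 3]
            # cotangent of the angle at vertex l, opposite edge (i, j)
            u, w = V[i] - V[l], V[j] - V[l]
            cot = np.dot(u, w) / np.linalg.norm(np.cross(u, w))
            K[i] += cot * (V[i] - V[j])
            K[j] += cot * (V[j] - V[i])
        # distribute the triangle area evenly to its three vertices
        a = 0.5 * np.linalg.norm(np.cross(V[tri[1]] - V[tri[0]],
                                          V[tri[2]] - V[tri[0]]))
        area[tri] += a / 3.0
    return K / (2.0 * area[:, None])  # mean-curvature normal per vertex
```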

2,003 citations

Proceedings ArticleDOI
01 Jul 1999
TL;DR: Methods are developed to rapidly remove rough features from irregularly triangulated data intended to portray a smooth surface, and the proposed curvature and Laplacian operators are proved to have several mathematically desirable qualities that improve the appearance of the resulting surface.
Abstract: In this paper, we develop methods to rapidly remove rough features from irregularly triangulated data intended to portray a smooth surface. The main task is to remove undesirable noise and uneven edges while retaining desirable geometric features. The problem arises mainly when creating high-fidelity computer graphics objects using imperfectly-measured data from the real world. Our approach contains three novel features: an implicit integration method to achieve efficiency, stability, and large time-steps; a scale-dependent Laplacian operator to improve the diffusion process; and finally, a robust curvature flow operator that achieves a smoothing of the shape itself, distinct from any parameterization. Additional features of the algorithm include automatic exact volume preservation, and hard and soft constraints on the positions of the points in the mesh. We compare our method to previous operators and related algorithms, and prove that our curvature and Laplacian operators have several mathematically-desirable qualities that improve the appearance of the resulting surface. In consequence, the user can easily select the appropriate operator according to the desired type of fairing. Finally, we provide a series of examples to graphically and numerically demonstrate the quality of our results.
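The implicit integration step is the backward-Euler update (I - lambda*dt*L) X_new = X, which stays stable at large time steps. Here is a minimal sketch (not the paper's implementation) using a uniform "umbrella" Laplacian; the paper additionally develops scale-dependent and curvature-flow operators.

```python
# Minimal sketch of one implicit fairing step with a uniform umbrella
# Laplacian L = D^{-1} A - I built from mesh connectivity.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def implicit_fairing_step(V, F, lam=1.0, dt=0.5):
    """V: (n,3) positions, F: (m,3) triangles; returns smoothed positions."""
    n = len(V)
    rows, cols = [], []
    for tri in F:
        for k in range(3):
            rows += [tri[k], tri[(k + 1) % 3]]
            cols += [tri[(k + 1) % 3], tri[k]]
    A = sp.coo_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n)).tocsr()
    A.data[:] = 1.0                            # collapse duplicate edge entries
    deg = np.asarray(A.sum(axis=1)).ravel()    # vertex valences
    L = sp.diags(1.0 / deg) @ A - sp.eye(n)
    # backward Euler: (I - lam*dt*L) V_new = V, solved per coordinate
    M = (sp.eye(n) - lam * dt * L).tocsc()
    solve = spla.factorized(M)                 # one factorization, three solves
    return np.column_stack([solve(V[:, c]) for c in range(3)])
```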

1,651 citations

Proceedings ArticleDOI
01 Jul 2003
TL;DR: Two basic, broadly useful computational kernels are implemented on the GPU, a sparse-matrix conjugate gradient solver and a regular-grid multigrid solver, and demonstrated on high-intensity numerical simulations of geometric flow and fluid dynamics.
Abstract: Many computer graphics applications require high-intensity numerical simulation. We show that such computations can be performed efficiently on the GPU, which we regard as a full-function streaming processor with high floating-point performance. We implemented two basic, broadly useful computational kernels: a sparse matrix conjugate gradient solver and a regular-grid multigrid solver. Real-time applications ranging from mesh smoothing and parameterization to fluid solvers and solid mechanics can greatly benefit from these kernels, as evidenced by our example applications of geometric flow and fluid simulation running on NVIDIA's GeForce FX.
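For reference, the conjugate gradient kernel that the paper maps onto GPU fragment programs is, on the CPU, a short loop. This sketch assumes only a symmetric positive-definite operator supplied as a matrix-vector product, which is the same access pattern the GPU version streams.

```python
# Minimal CPU sketch of the conjugate gradient kernel; A_mul(x) computes A @ x
# for a symmetric positive-definite A (dense, sparse, or matrix-free).
import numpy as np

def conjugate_gradient(A_mul, b, tol=1e-8, max_iter=1000):
    """Solve A x = b given only the matrix-vector product A_mul."""
    x = np.zeros_like(b)
    r = b - A_mul(x)          # residual
    p = r.copy()              # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A_mul(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```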

870 citations

Proceedings ArticleDOI
15 Sep 1995
TL;DR: This paper shows how biorthogonal wavelets with custom properties can be constructed with the lifting scheme, and gives examples of functions defined on the sphere that can be efficiently represented with spherical wavelets.
Abstract: Wavelets have proven to be powerful bases for use in numerical analysis and signal processing. Their power lies in the fact that they only require a small number of coefficients to represent general functions and large data sets accurately. This allows compression and efficient computations. Classical constructions have been limited to simple domains such as intervals and rectangles. In this paper we present a wavelet construction for scalar functions defined on the sphere. We show how biorthogonal wavelets with custom properties can be constructed with the lifting scheme. The bases are extremely easy to implement and allow fully adaptive subdivisions. We give examples of functions defined on the sphere, such as topographic data, bidirectional reflection distribution functions, and illumination, and show how they can be efficiently represented with spherical wavelets.
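The lifting scheme itself is simple to illustrate in one dimension. The sketch below is a stand-in for the paper's spherical construction (which lifts on a triangulated sphere rather than a line): one predict/update level of a linear-prediction biorthogonal wavelet, assuming an even-length signal with periodic boundaries.

```python
# Minimal 1-D lifting sketch: split into even/odd samples, predict the odd
# samples from their even neighbours, then update the evens to preserve the
# signal mean. The inverse simply runs the same steps backwards.
import numpy as np

def lifting_forward(s):
    even, odd = s[0::2].astype(float), s[1::2].astype(float)
    d = odd - 0.5 * (even + np.roll(even, -1))   # predict (detail coefficients)
    c = even + 0.25 * (d + np.roll(d, 1))        # update (coarse coefficients)
    return c, d

def lifting_inverse(c, d):
    even = c - 0.25 * (d + np.roll(d, 1))        # undo update
    odd = d + 0.5 * (even + np.roll(even, -1))   # undo predict
    s = np.empty(2 * len(c))
    s[0::2], s[1::2] = even, odd
    return s
```

Because each lifting step is trivially invertible, the transform reconstructs exactly regardless of the predict and update filters chosen, which is what lets the paper customize wavelet properties on the sphere.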

766 citations

Proceedings ArticleDOI
24 Jul 1998
TL;DR: An irregular connectivity mesh representative of a surface having an arbitrary topology is processed to generate a parameterization which maps points in a coarse base domain to points in the mesh, such that the original mesh can be reconstructed from the base domain and the parameterization.
Abstract: An irregular connectivity mesh representative of a surface having an arbitrary topology is processed to generate a parameterization which maps points in a coarse base domain to points in the mesh. An illustrative embodiment uses a multi-level mesh simplification process in conjunction with conformal mapping to efficiently construct a parameterization of a mesh comprising a large number of triangles over a base domain comprising a smaller number of triangles. The parameterization in this embodiment corresponds to the inverse of a function mapping each point in the original mesh to one of the triangles of the base domain, such that the original mesh can be reconstructed from the base domain and the parameterization. The mapping function is generated as a combination of a number of sub-functions, each of which relates data points in a mesh of one level in a simplification hierarchy to data points in a mesh of the next coarser level of the simplification hierarchy. The parameterization can also be used to construct, from the original irregular connectivity mesh, an adaptive remesh having a regular connectivity which is substantially easier to process than the original mesh.
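The hierarchical structure of this mapping can be sketched in the abstract: the full map to the base domain composes per-level sub-functions, each re-expressing a point on one level of the simplification hierarchy in terms of the next coarser level. In this hypothetical sketch, each map_level_down callable is a placeholder for the barycentric re-expression performed after one simplification step.

```python
# Hypothetical sketch of the hierarchy-of-maps idea: compose per-level maps,
# ordered from the finest level down to the coarse base domain.
def compose_to_base(point, per_level_maps):
    for map_level_down in per_level_maps:   # finest -> coarsest
        point = map_level_down(point)
    return point

# Toy usage: three stand-in "levels" acting on a 1-D coordinate.
maps = [lambda p: p * 0.5, lambda p: p + 1.0, lambda p: p * 2.0]
print(compose_to_base(4.0, maps))   # ((4*0.5)+1)*2 = 6.0
```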

750 citations


Cited by
Book
01 Jan 1998
TL;DR: A comprehensive textbook treatment of wavelet signal processing, progressing from Fourier and time-frequency foundations through frames, wavelet bases, and wavelet packet and local cosine bases to approximation, estimation, and transform coding.
Abstract (table of contents): Introduction to a Transient World; Fourier Kingdom; Discrete Revolution; Time Meets Frequency; Frames; Wavelet Zoom; Wavelet Bases; Wavelet Packet and Local Cosine Bases; An Approximation Tour; Estimations are Approximations; Transform Coding; Appendix A: Mathematical Complements; Appendix B: Software Toolboxes.

17,693 citations

Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiments.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiments. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
TL;DR: In this article, the concept of isogeometric analysis is proposed and the basis functions generated from NURBS (Non-Uniform Rational B-Splines) are employed to construct an exact geometric model.
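The B-spline basis functions underlying NURBS can be evaluated with the Cox-de Boor recursion; the sketch below is an illustration of that standard recursion, not the article's code.

```python
# Minimal Cox-de Boor recursion: value of the B-spline basis N_{i,p} at
# parameter u for a given knot vector (half-open interval convention).
def bspline_basis(i, p, u, knots):
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:           # skip zero-width spans
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, u, knots))
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i, p - 1, u, knots))
    return left + right
```

Isogeometric analysis uses rational combinations of these functions both to represent the exact CAD geometry and as the approximation basis for analysis, avoiding the geometric error introduced by conventional mesh generation.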

5,137 citations

Posted Content
TL;DR: A hierarchical neural network that applies PointNet recursively on a nested partitioning of the input point set and proposes novel set learning layers to adaptively combine features from multiple scales to learn deep point set features efficiently and robustly.
Abstract: Few prior works study deep learning on point sets. PointNet by Qi et al. is a pioneer in this direction. However, by design PointNet does not capture local structures induced by the metric space points live in, limiting its ability to recognize fine-grained patterns and generalizability to complex scenes. In this work, we introduce a hierarchical neural network that applies PointNet recursively on a nested partitioning of the input point set. By exploiting metric space distances, our network is able to learn local features with increasing contextual scales. With further observation that point sets are usually sampled with varying densities, which results in greatly decreased performance for networks trained on uniform densities, we propose novel set learning layers to adaptively combine features from multiple scales. Experiments show that our network called PointNet++ is able to learn deep point set features efficiently and robustly. In particular, results significantly better than state-of-the-art have been obtained on challenging benchmarks of 3D point clouds.
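The nested partitioning begins by choosing centroids with farthest point sampling; below is a minimal sketch (not the authors' implementation) of that step.

```python
# Minimal farthest point sampling: greedily pick the point farthest from the
# already-chosen set, yielding k well-spread centroids.
import numpy as np

def farthest_point_sampling(points, k):
    """points: (n,3) array; returns indices of k well-spread points."""
    chosen = [0]                                  # start from an arbitrary point
    dist = np.linalg.norm(points - points[0], axis=1)
    for _ in range(k - 1):
        idx = int(np.argmax(dist))                # farthest from the chosen set
        chosen.append(idx)
        dist = np.minimum(dist, np.linalg.norm(points - points[idx], axis=1))
    return np.array(chosen)
```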

4,802 citations

Book
30 Sep 2010
TL;DR: Computer Vision: Algorithms and Applications explores the variety of techniques commonly used to analyze and interpret images and takes a scientific approach to basic vision problems, formulating physical models of the imaging process before inverting them to produce descriptions of a scene.
Abstract: Humans perceive the three-dimensional structure of the world with apparent ease. However, despite all of the recent advances in computer vision research, the dream of having a computer interpret an image at the same level as a two-year old remains elusive. Why is computer vision such a challenging problem and what is the current state of the art? Computer Vision: Algorithms and Applications explores the variety of techniques commonly used to analyze and interpret images. It also describes challenging real-world applications where vision is being successfully used, both for specialized applications such as medical imaging, and for fun, consumer-level tasks such as image editing and stitching, which students can apply to their own personal photos and videos. More than just a source of recipes, this exceptionally authoritative and comprehensive textbook/reference also takes a scientific approach to basic vision problems, formulating physical models of the imaging process before inverting them to produce descriptions of a scene. These problems are also analyzed using statistical models and solved using rigorous engineering techniques. Topics and features: structured to support active curricula and project-oriented courses, with tips in the Introduction for using the book in a variety of customized courses; presents exercises at the end of each chapter with a heavy emphasis on testing algorithms and containing numerous suggestions for small mid-term projects; provides additional material and more detailed mathematical topics in the Appendices, which cover linear algebra, numerical techniques, and Bayesian estimation theory; suggests additional reading at the end of each chapter, including the latest research in each sub-field, in addition to a full Bibliography at the end of the book; supplies supplementary course material for students at the associated website, http://szeliski.org/Book/. Suitable for an upper-level undergraduate or graduate-level course in computer science or engineering, this textbook focuses on basic techniques that work under real-world conditions and encourages students to push their creative boundaries. Its design and exposition also make it eminently suitable as a unique reference to the fundamental techniques and current research literature in computer vision.

4,146 citations