Author

Herbert Edelsbrunner

Bio: Herbert Edelsbrunner is an academic researcher at the Institute of Science and Technology Austria. He has contributed to research on topics including Delaunay triangulations and Voronoi diagrams, has an h-index of 84, and has co-authored 377 publications receiving 33,877 citations. Previous affiliations of Herbert Edelsbrunner include the University of Illinois at Urbana–Champaign and Duke University.


Papers
Posted Content
TL;DR: Describes an algorithm for constructing a cell complex in space-time, called the medusa, that measures topological properties of a spatial sorting process; the algorithm requires an extension of the kinetic data structures framework from Delaunay triangulations to fixed-radius alpha complexes.
Abstract: Motivated by an application in cell biology, we consider spatial sorting processes defined by particles moving from an initial to a final configuration. We describe an algorithm for constructing a cell complex in space-time, called the medusa, that measures topological properties of the sorting process. The algorithm requires an extension of the kinetic data structures framework from Delaunay triangulations to fixed-radius alpha complexes. We report on several techniques to accelerate the computation.

5 citations

Proceedings ArticleDOI
01 Jan 2018
TL;DR: Focusing on Bregman divergences to measure dissimilarity, this work proves bounds on the location of the center of a smallest enclosing sphere; the bounds depend on the range of radii for which Bregman balls are convex.
Abstract: Smallest enclosing spheres of finite point sets are central to methods in topological data analysis. Focusing on Bregman divergences to measure dissimilarity, we prove bounds on the location of the center of a smallest enclosing sphere. These bounds depend on the range of radii for which Bregman balls are convex.
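The notion at the heart of this abstract can be made concrete in a few lines. The sketch below is an illustration of the definition, not the paper's algorithm: it evaluates the Bregman divergence D_F(x, y) = F(x) − F(y) − ⟨∇F(y), x − y⟩ for two standard generators, half the squared Euclidean norm (which recovers half the squared distance) and negative Shannon entropy (which recovers the Kullback–Leibler divergence). The function names are ours.

```python
import numpy as np

def bregman(F, grad_F, x, y):
    """Bregman divergence D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return F(x) - F(y) - grad_F(y) @ (x - y)

# Generator 1: half the squared Euclidean norm -> half the squared distance.
sq = lambda v: 0.5 * v @ v
sq_grad = lambda v: v

# Generator 2: negative Shannon entropy -> Kullback-Leibler divergence
# (for vectors with positive entries summing to 1).
neg_entropy = lambda p: float(np.sum(p * np.log(p)))
neg_entropy_grad = lambda p: np.log(p) + 1.0

print(bregman(sq, sq_grad, [1.0, 2.0], [0.0, 0.0]))  # 0.5 * ||x - y||^2 = 2.5
print(bregman(neg_entropy, neg_entropy_grad, [0.5, 0.5], [0.25, 0.75]))  # KL(p || q)
```

Because D_F is generally asymmetric, the sets {x : D_F(x, c) ≤ r} and {x : D_F(c, x) ≤ r} differ; the first is always convex (F convex minus an affine function), while the second need not be, which is where the paper's range of radii enters.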

5 citations

Book ChapterDOI
11 Apr 1984
TL;DR: Presents several key problems from the classical part of computational geometry that exhibit strong interrelations, stressing a unified view of the problems and working out the general ideas behind the methods that solve them.
Abstract: Computational geometry, considered a subfield of computer science, is concerned with the computational aspects of geometric problems. The increasing activity in this rather young field made it split into several reasonably independent subareas. This paper presents several key-problems of the classical part of computational geometry which exhibit strong interrelations. A unified view of the problems is stressed, and the general ideas behind the methods that solve them are worked out.

5 citations

01 Jan 2004
TL;DR: This thesis studies and extracts topological features of the data, uses them for visualization, and develops visualization software that performs local comparisons between pairs of functions in datasets containing multiple, sometimes time-varying, functions.
Abstract: Scientists attempt to understand physical phenomena by studying various quantities measured over the region of interest. A majority of these quantities are scalar (real-valued) functions. These functions are typically studied using traditional visualization techniques like isosurface extraction, volume rendering, etc. As the data grows in size and becomes increasingly complex, these techniques are no longer effective. State-of-the-art visualization methods attempt to automatically extract features and annotate a display of the data with a visualization of its features. In this thesis, we study and extract the topological features of the data and use them for visualization. We have three results:
(1) An algorithm that simplifies a scalar function defined over a tetrahedral mesh. In addition to minimizing the error introduced by the approximation of the function, the algorithm improves the mesh quality and preserves the topology of the domain. We perform an extensive set of experiments to study the effect of requiring better mesh quality on the approximation error and the level of simplification possible. We also study the effect of simplification on the topological features of the data.
(2) An extension of three-dimensional Morse-Smale complexes to piecewise linear 3-manifolds and an efficient algorithm to compute its combinatorial analog. Morse-Smale complexes partition the domain into regions with similar gradient flows. Letting n be the number of vertices in the input mesh, the running time of the algorithm is proportional to n log(n) plus the total size of the input mesh plus the total size of the output. We develop a visualization tool that displays different substructures of the Morse-Smale complex.
(3) A new comparison measure between k functions defined on a common d-manifold. For the case d = k = 2, we give alternative formulations of the definition based on a Morse theoretic point of view.
We also develop visualization software that performs local comparison between pairs of functions in datasets containing multiple and sometimes time-varying functions. We apply our methods to data from medical imaging, electron microscopy, and x-ray crystallography. The results of these experiments provide evidence of the usability of our methods.
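As a toy illustration of the kind of topological feature the thesis extracts (and nothing more than that: the actual work handles Morse-Smale complexes on piecewise linear 3-manifolds), the sketch below classifies interior samples of a one-dimensional piecewise-linear function as local minima or maxima. The helper name is ours.

```python
def critical_points(values):
    """Classify interior vertices of a 1D piecewise-linear function,
    given its values at consecutive sample points, as local minima or
    local maxima. Assumes consecutive values are distinct, a genericity
    condition analogous to the Morse condition."""
    crit = []
    for i in range(1, len(values) - 1):
        left, here, right = values[i - 1], values[i], values[i + 1]
        if here < left and here < right:
            crit.append((i, "min"))
        elif here > left and here > right:
            crit.append((i, "max"))
    return crit

print(critical_points([3, 1, 2, 0, 4]))  # [(1, 'min'), (2, 'max'), (3, 'min')]
```

In one dimension the critical points already determine the analogue of the Morse-Smale decomposition: the domain splits into monotone segments between consecutive critical points.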

5 citations

01 Jan 2004
TL;DR: This thesis develops efficient computational methods for describing and comparing molecular structures by combining geometric and topological approaches, including an efficient algorithm to find promising initial relative placements of the proteins in docking.
Abstract: With the recent success of the Human Genome Project, one of the main challenges in molecular biology in this post-genomic era is the determination and exploitation of the three-dimensional structure of proteins and their function. The ability for proteins to perform their numerous functions is made possible by the diversity of their three-dimensional structures. Hence, to attack the key problems involved, such as protein folding and docking, geometry and topology become important tools. Despite their essential roles, geometric and topological methods are relatively uncommon in computational biology, partly due to a number of modeling and algorithmic challenges. This thesis describes efficient computational methods for describing and comparing molecular structures by combining both geometric and topological approaches. In particular, in the first part of the thesis, we study three geometric descriptions: (i) the writhing number of protein backbones, which measures how many times a backbone coils around itself; (ii) the level-of-details representation of protein backbones via simplification, which helps to extract main features of backbones; and (iii) the elevation of molecular surfaces, which we propose to identify geometric features such as protrusions and cavities from protein surfaces. We develop efficient algorithms for computing these descriptions. The second part of the thesis focuses on molecular shape matching algorithms. By modeling a molecule as the union of balls, we propose algorithms to compute the similarity between two such unions by (variants of) the widely used Hausdorff distance. We also study the protein docking problem, which, from a geometric perspective, can be considered as the problem of searching for configurations with maximum complementarity between two molecular surfaces. 
Using the feature information computed from the elevation function, we describe an efficient algorithm to find promising initial relative placements of the proteins.
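The shape-matching part of the thesis rests on the Hausdorff distance. The brute-force sketch below computes it for finite point sets; this is only the textbook O(nm) definition, not the thesis's algorithms, which work with variants for unions of balls.

```python
import math

def directed_hausdorff(A, B):
    """Largest distance from a point of A to its nearest point of B."""
    return max(min(math.dist(a, b) for b in B) for a in A)

def hausdorff(A, B):
    """Symmetric Hausdorff distance: the larger of the two directed distances."""
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

A = [(0.0, 0.0), (1.0, 0.0)]
B = [(0.0, 0.0), (0.0, 2.0)]
print(hausdorff(A, B))  # 2.0: the point (0, 2) is 2 away from its nearest neighbor in A
```

The asymmetry of the directed distance is the reason both directions must be taken: here directed_hausdorff(A, B) is 1.0 while directed_hausdorff(B, A) is 2.0.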

5 citations


Cited by
Journal ArticleDOI
TL;DR: Describes the goals of the PDB, the systems in place for data deposition and access, how to obtain further information, and plans for the future development of the resource.
Abstract: The Protein Data Bank (PDB; http://www.rcsb.org/pdb/ ) is the single worldwide archive of structural data of biological macromolecules. This paper describes the goals of the PDB, the systems in place for data deposition and access, how to obtain further information, and near-term plans for the future development of the resource.

34,239 citations

Book
08 Sep 2000
TL;DR: This book presents dozens of algorithms and implementation examples, all in pseudo-code and suitable for use in real-world, large-scale data mining projects, and provides a comprehensive, practical look at the concepts and techniques you need to get the most out of real business data.
Abstract: The increasing volume of data in modern business and science calls for more complex and sophisticated tools. Although advances in data mining technology have made extensive data collection much easier, the field is still evolving, and there is a constant need for new techniques and tools that can help us transform this data into useful information and knowledge. Since the previous edition's publication, great advances have been made in the field of data mining. Not only does the third edition of Data Mining: Concepts and Techniques continue the tradition of equipping you with an understanding and application of the theory and practice of discovering patterns hidden in large data sets, it also focuses on new, important topics in the field: data warehouses and data cube technology, mining streams, mining social networks, and mining spatial, multimedia and other complex data. Each chapter is a stand-alone guide to a critical topic, presenting proven algorithms and sound implementations ready to be used directly or with strategic modification against live data. This is the resource you need if you want to apply today's most powerful data mining techniques to meet real business challenges.
* Presents dozens of algorithms and implementation examples, all in pseudo-code and suitable for use in real-world, large-scale data mining projects.
* Addresses advanced topics such as mining object-relational databases, spatial databases, multimedia databases, time-series databases, text databases, the World Wide Web, and applications in several fields.
* Provides a comprehensive, practical look at the concepts and techniques you need to get the most out of real business data.

23,600 citations

Book
25 Oct 1999
TL;DR: This highly anticipated third edition of the most acclaimed work on data mining and machine learning will teach you everything you need to know about preparing inputs, interpreting outputs, evaluating results, and the algorithmic methods at the heart of successful data mining.
Abstract: Data Mining: Practical Machine Learning Tools and Techniques offers a thorough grounding in machine learning concepts as well as practical advice on applying machine learning tools and techniques in real-world data mining situations. This highly anticipated third edition of the most acclaimed work on data mining and machine learning will teach you everything you need to know about preparing inputs, interpreting outputs, evaluating results, and the algorithmic methods at the heart of successful data mining. Thorough updates reflect the technical changes and modernizations that have taken place in the field since the last edition, including new material on Data Transformations, Ensemble Learning, Massive Data Sets, Multi-instance Learning, plus a new version of the popular Weka machine learning software developed by the authors. Witten, Frank, and Hall include both tried-and-true techniques of today as well as methods at the leading edge of contemporary research.
* Provides a thorough grounding in machine learning concepts as well as practical advice on applying the tools and techniques to your data mining projects.
* Offers concrete tips and techniques for performance improvement that work by transforming the input or output in machine learning methods.
* Includes the downloadable Weka software toolkit, a collection of machine learning algorithms for data mining tasks, in an updated, interactive interface. Algorithms in the toolkit cover: data pre-processing, classification, regression, clustering, association rules, and visualization.

20,196 citations

MonographDOI
01 Jan 2006
TL;DR: This coherent and comprehensive book unifies material from several sources, including robotics, control theory, artificial intelligence, and algorithms, and delves into planning under differential constraints that arise when automating the motions of virtually any mechanical system.
Abstract: Planning algorithms are impacting technical disciplines and industries around the world, including robotics, computer-aided design, manufacturing, computer graphics, aerospace applications, drug design, and protein folding. This coherent and comprehensive book unifies material from several sources, including robotics, control theory, artificial intelligence, and algorithms. The treatment is centered on robot motion planning but integrates material on planning in discrete spaces. A major part of the book is devoted to planning under uncertainty, including decision theory, Markov decision processes, and information spaces, which are the “configuration spaces” of all sensor-based planning problems. The last part of the book delves into planning under differential constraints that arise when automating the motions of virtually any mechanical system. Developed from courses taught by the author, the book is intended for students, engineers, and researchers in robotics, artificial intelligence, and control theory as well as computer graphics, algorithms, and computational biology.

6,340 citations