
Showing papers by "Ivana Kolingerová published in 2007"


Journal ArticleDOI
TL;DR: An overview of existing stripification methods and a detailed description of several important stripification methods for fully triangulated meshes are presented.

14 citations
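As a companion to the survey topic, the following is a minimal greedy stripification sketch (not any particular method reviewed in the paper): starting from an unused triangle, the strip repeatedly grows into an unused neighbour that shares the strip's last edge, emitting one new vertex per step. Winding-order bookkeeping and swap vertices, which real stripifiers must handle, are ignored here.

```python
# Minimal greedy stripification sketch; winding order is ignored.
from collections import defaultdict

def build_edge_map(triangles):
    """Map each undirected edge to the indices of triangles containing it."""
    edge_map = defaultdict(list)
    for t, (a, b, c) in enumerate(triangles):
        for u, v in ((a, b), (b, c), (c, a)):
            edge_map[frozenset((u, v))].append(t)
    return edge_map

def greedy_strips(triangles):
    edge_map = build_edge_map(triangles)
    used = [False] * len(triangles)
    strips = []
    for start in range(len(triangles)):
        if used[start]:
            continue
        used[start] = True
        strip = list(triangles[start])            # first triangle: three vertices
        while True:
            last_edge = frozenset(strip[-2:])     # edge a successor must share
            nxt = next((t for t in edge_map[last_edge] if not used[t]), None)
            if nxt is None:
                break
            used[nxt] = True
            # the new strip vertex is the one not lying on the shared edge
            strip.append(next(v for v in triangles[nxt] if v not in last_edge))
        strips.append(strip)
    return strips

# Two triangles sharing an edge collapse into one 4-vertex strip.
print(greedy_strips([(0, 1, 2), (2, 1, 3)]))      # [[0, 1, 2, 3]]
```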


Proceedings ArticleDOI
26 Apr 2007
TL;DR: This paper introduces a general model for real-time path planning in a dynamic environment and provides a hybrid technique that combines graph and grid representations of the examined space; the graph part uses an adaptive mesh so that it can adapt to the changing environment.
Abstract: Solutions for the common problem of path planning in an abstract environment have been extensively studied in many scientific disciplines. However, many explored techniques assume that the environment does not change and that there is a complete and detailed overview of the examined space. In addition, many path planning methods need to derive a specific graph structure from the environment representation, and it can often be very difficult to obtain this structure in real applications. In our paper, we introduce a general model for real-time path planning in a dynamic environment and provide a hybrid technique that combines a graph and a grid representation of the examined space. The proposed path planning method uses an adaptive mesh for its graph part so that it can adapt to the changing environment. The presented method retrieves paths faster than classical raster-based approaches and works in a dynamic environment where conventional graph-based techniques fail. On the other hand, the proposed solution still has higher memory requirements due to the necessary raster representation of the examined environment.

8 citations
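The sketch below illustrates only the raster side of such a hybrid model: plain A* search on a 4-connected occupancy grid. The adaptive-mesh graph part and the dynamic updates described in the paper are not reproduced; the grid encoding and the Manhattan heuristic are illustrative assumptions.

```python
# A* path planning on a 4-connected occupancy grid (raster part only).
import heapq

def astar(grid, start, goal):
    """grid: 2D list, 0 = free, 1 = blocked; start/goal: (row, col)."""
    rows, cols = len(grid), len(grid[0])
    def h(p):                                     # Manhattan-distance heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue                              # already finalised
        came_from[cur] = parent
        if cur == goal:                           # reconstruct the path
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                if g + 1 < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = g + 1
                    heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None                                   # no path exists

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))                # detours around the blocked row
```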


20 Dec 2007
TL;DR: This work proposes a modification of the original data-stream k-median clustering to solve facility location, i.e., the case when the number of clusters in the input data is not known a priori.
Abstract: Using recent knowledge in data stream clustering, we present a modified approach to the facility location problem in the context of geometric data streams. We give insight into the existing algorithm from a less mathematical point of view, focusing on understanding and practical use, namely by computer graphics experts. We propose a modification of the original data-stream k-median clustering to solve facility location, which is the case when we do not know the number of clusters in the input data a priori. Like the original, the modified version is capable of processing millions of points while using a rather small amount of memory. Based on our experiments with clustering geometric data, we present suggestions on how to set the processing parameters. We also describe how the algorithm handles various distributions of input data within the stream. These findings may be applied back to the original algorithm. CR Categories: I.5.3 [Computing Methodologies]: Pattern Recognition—Clustering; I.3.5 [Computing Methodologies]: Computer Graphics—Computational Geometry and Object Modeling

6 citations
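For intuition, here is a hedged sketch of a classical online facility-location heuristic in the spirit of Meyerson's algorithm, which likewise does not require the number of clusters in advance. It is not the authors' modification of data-stream k-median clustering; the facility cost and the synthetic data are illustrative assumptions.

```python
# Online facility location: each arriving point either joins its nearest open
# facility or opens a new one with probability proportional to its distance.
import math
import random

def online_facility_location(stream, facility_cost, rng=random.random):
    facilities = []                        # facility centres opened so far
    total_cost = 0.0
    for p in stream:
        if facilities:
            d = min(math.dist(p, c) for c in facilities)
        else:
            d = float("inf")
        if d == float("inf") or rng() < min(d / facility_cost, 1.0):
            facilities.append(p)           # open a facility at the new point
            total_cost += facility_cost
        else:
            total_cost += d                # assign the point to the nearest facility
    return facilities, total_cost

# Two well-separated blobs: with a moderate facility cost the heuristic opens
# only a few facilities, concentrated in the two blobs, without fixing k.
random.seed(1)
blob_a = [(random.gauss(0, 0.1), random.gauss(0, 0.1)) for _ in range(50)]
blob_b = [(random.gauss(5, 0.1), random.gauss(5, 0.1)) for _ in range(50)]
centres, cost = online_facility_location(blob_a + blob_b, facility_cost=3.0)
print(len(centres), round(cost, 2))
```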


Journal ArticleDOI
TL;DR: It is shown how the vertex normal can be computed efficiently for an arbitrary triangular polygon mesh under linear deformation using the weighting scheme referred to by Jin et al. as "mean weighted by areas of adjacent triangles."
Abstract: In this paper, we deal with shading of linearly deforming triangular meshes that deform in time so that each vertex travels independently along its linear trajectory. We will show how the vertex normal can be computed efficiently for an arbitrary triangular polygon mesh under linear deformation using the weighting scheme referred to by Jin et al. as "mean weighted by areas of adjacent triangles." Our computation approach is also faster than simple normal recomputation. Moreover, it is more accurate than the usual linear interpolation. The proposed approach is general enough to be used to compute the vertex normal for any number of adjacent faces.

2 citations
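The static version of the "mean weighted by areas of adjacent triangles" scheme can be sketched as follows: summing the unnormalised cross products of adjacent triangle edges weights each face normal by its area automatically, because the cross-product magnitude equals twice the triangle area. The paper's actual contribution, updating such normals efficiently while every vertex moves along its own linear trajectory, is not reproduced here.

```python
# Area-weighted vertex normals via unnormalised face cross products.
import numpy as np

def area_weighted_vertex_normals(vertices, triangles):
    """vertices: (n, 3) array of positions; triangles: iterable of index triples."""
    vertices = np.asarray(vertices, dtype=float)
    normals = np.zeros_like(vertices)
    for a, b, c in triangles:
        # direction = face normal, magnitude = twice the triangle area,
        # so accumulating it weights each face by its area
        face = np.cross(vertices[b] - vertices[a], vertices[c] - vertices[a])
        normals[a] += face
        normals[b] += face
        normals[c] += face
    lengths = np.linalg.norm(normals, axis=1, keepdims=True)
    return normals / np.where(lengths > 0, lengths, 1.0)

verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
tris = [(0, 1, 2), (0, 3, 1)]              # two faces meeting at vertices 0 and 1
print(area_weighted_vertex_normals(verts, tris))
```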


Proceedings Article
01 Jan 2007
TL;DR: A novel out-of-core technique for construction of the Delaunay triangulation of such large data sets is proposed, based on the method of incremental insertion with flipping, which is simple, robust, and can be easily extended to weighted points, constraints, etc.
Abstract: In the last couple of years, very detailed high-resolution terrain data sets have become available thanks to new acquisition techniques, e.g., airborne laser scanning. Such data sets typically contain several millions of points and therefore require several gigabytes just to store them, which prevents loading them into the memory of a common computer. In this paper, we propose a novel out-of-core technique for construction of the Delaunay triangulation of such large data sets. It is based on the method of incremental insertion with flipping, which is simple, robust, and can be easily extended for weighted points, constraints, etc. The proposed technique was tested on various data sets with sizes up to 128M points on commodity hardware (P4 3.2 GHz, 2 GB RAM, 250 GB SATA disk). The largest data set was processed in about 2.5 hours.

2 citations
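The core operation behind incremental insertion with flipping is the Lawson flip criterion, sketched below for a single pair of triangles: the shared edge is flipped when the opposite vertex of one triangle lies inside the circumcircle of the other. The out-of-core streaming and paging machinery of the paper is not reproduced, and the point coordinates are illustrative.

```python
# The Lawson flip criterion for one pair of triangles sharing an edge.
def in_circumcircle(a, b, c, d):
    """True if d lies strictly inside the circumcircle of CCW triangle (a, b, c)."""
    ax, ay = a[0] - d[0], a[1] - d[1]
    bx, by = b[0] - d[0], b[1] - d[1]
    cx, cy = c[0] - d[0], c[1] - d[1]
    det = ((ax * ax + ay * ay) * (bx * cy - cx * by)
           - (bx * bx + by * by) * (ax * cy - cx * ay)
           + (cx * cx + cy * cy) * (ax * by - bx * ay))
    return det > 0

def legalise(tri_abc, tri_abd):
    """Given triangles (a, b, c) and (a, b, d) sharing edge a-b, return the
    Delaunay-legal pair, flipping the shared edge to c-d if necessary."""
    a, b, c = tri_abc
    d = next(v for v in tri_abd if v not in (a, b))
    if in_circumcircle(a, b, c, d):
        return (a, d, c), (d, b, c)        # flipped configuration
    return tri_abc, tri_abd                # already locally Delaunay

# Four points where the initial diagonal violates the Delaunay criterion:
pts = {"a": (0.0, 0.0), "b": (4.0, 0.0), "c": (2.0, 0.5), "d": (2.0, -0.5)}
t1, t2 = legalise((pts["a"], pts["b"], pts["c"]),
                  (pts["a"], pts["b"], pts["d"]))
print(t1, t2)   # the long edge a-b is replaced by the short edge c-d
```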


Journal ArticleDOI
01 Feb 2007
TL;DR: Some observations and modifications for this step of surface reconstruction are presented, which improve the quality of the resulting mesh, especially in the case when the surface contains boundaries.
Abstract: One of the methods for 3D data model acquisition is real object digitisation followed by surface reconstruction. Many algorithms have been developed to do this, each with its advantages and disadvantages. We use for the reconstruction the CRUST algorithm by Nina Amenta, which selects surface triangles from the Delaunay tetrahedronisation using information from the dual Voronoi diagram. Unfortunately, these candidate surface triangles do not form a manifold, so it is necessary to perform some other steps for manifold extraction. In this paper, we present some observations and modifications for this step, which improve the quality of the resulting mesh, especially in the case when the surface contains boundaries.

1 citation
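A greatly simplified illustration of one manifold-extraction step is sketched below: candidate triangles incident to a non-manifold edge (an edge shared by more than two triangles) are discarded until every edge has at most two incident faces. The pruning choice here is arbitrary; the paper's actual modifications use geometric information and handle surface boundaries, which this sketch does not.

```python
# Naive pruning of candidate surface triangles at non-manifold edges.
from collections import defaultdict

def prune_nonmanifold(triangles):
    """triangles: list of vertex-index triples (candidate surface faces)."""
    triangles = list(triangles)
    while True:
        edge_to_tris = defaultdict(list)
        for t, (a, b, c) in enumerate(triangles):
            for u, v in ((a, b), (b, c), (c, a)):
                edge_to_tris[frozenset((u, v))].append(t)
        # find an edge used by more than two triangles
        bad = next((tris for tris in edge_to_tris.values() if len(tris) > 2), None)
        if bad is None:
            return triangles               # every edge now has at most two faces
        # drop one offending triangle (here simply the last one found);
        # a real implementation would choose using Voronoi/pole information
        del triangles[bad[-1]]

# Three triangles share edge (0, 1); one of them is pruned.
print(prune_nonmanifold([(0, 1, 2), (0, 1, 3), (0, 1, 4)]))
```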


Proceedings Article
01 Jan 2007
TL;DR: In this article, the authors use a CRUST algorithm for surface reconstruction, which selects surface triangles from the Delaunay tetrahedronisation using information from the dual Voronoi diagram.
Abstract: One of the methods for 3D data model acquisition is real object digitisation followed by surface reconstruction. Many algorithms have been developed to do this, each with its advantages and disadvantages. We use for the reconstruction the CRUST algorithm by Nina Amenta, which selects surface triangles from the Delaunay tetrahedronisation using information from the dual Voronoi diagram. Unfortunately, these candidate surface triangles do not form a manifold, so it is necessary to perform some other steps for manifold extraction. In this paper, we present some observations and modifications for this step, which improve the quality of the resulting mesh, especially in the case when the surface contains boundaries.

1 citation


Proceedings Article
01 Jan 2007
TL;DR: An argument is presented that a course oriented to applied computational geometry should be part of the computer graphics curriculum, as it teaches effective algorithmic methods and helps to develop abstract thinking.
Abstract: The paper surveys main features of computational geometry and presents the argument that a course oriented to applied computational geometry should be a part of the computer graphics curriculum, as it teaches effective algorithmic methods and helps to develop abstract thinking. Possible contents of the course and forms suitable and interesting for computer graphics students are discussed. The students' feedback on such a course has been mostly positive.

12 Mar 2007
TL;DR: Two new types of tunnel approximation are introduced and compared with the existing method of tunnel computation, and the application of the regular triangulation to tunnel computation is described.
Abstract: Proteins are organic compounds and are contained in every living cell. Protein engineering studies proteins and tries to modify them. For this purpose it uses (among others) an analysis of tunnels – paths which lead from an interesting place inside a protein to its surface. In this paper, the geometric approximation of tunnels in protein molecules and the application of the regular triangulation to their computation are described. Two new types of tunnel approximation are introduced and compared with the existing method of tunnel computation.
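As a rough geometric illustration of the tunnel idea (not the paper's method), the sketch below builds an unweighted Delaunay tetrahedronisation of the atom centres with SciPy, treats tetrahedra as nodes of a dual graph connected through shared faces, and finds the path from an interior tetrahedron to the outside that maximises the bottleneck clearance. The paper instead uses the regular (weighted) triangulation and more refined tunnel approximations; the clearance measure and the synthetic point set used here are illustrative assumptions.

```python
# Tunnel as a max-bottleneck path in the dual graph of a Delaunay tetrahedronisation.
import heapq
import numpy as np
from scipy.spatial import Delaunay, cKDTree

def widest_tunnel(points, interior_point):
    """Bottleneck clearance of the widest path from the tetrahedron
    containing interior_point to the outside of the convex hull."""
    points = np.asarray(points, dtype=float)
    dt = Delaunay(points)
    tree = cKDTree(points)
    start = int(dt.find_simplex(np.asarray(interior_point, dtype=float)))
    if start == -1:
        return None                        # interior_point lies outside the hull

    def clearance(simplex, opposite):
        # face of `simplex` opposite vertex index `opposite`; its clearance is
        # approximated by the distance from the face centroid to the nearest atom
        face = [v for i, v in enumerate(dt.simplices[simplex]) if i != opposite]
        return tree.query(points[face].mean(axis=0))[0]

    OUTSIDE = -1                           # SciPy marks hull faces with neighbour -1
    best = {start: float("inf")}
    heap = [(-float("inf"), start)]
    while heap:                            # max-bottleneck (widest-path) Dijkstra
        neg_w, s = heapq.heappop(heap)
        width = -neg_w
        if width < best.get(s, -1.0):
            continue                       # stale heap entry
        if s == OUTSIDE:
            return width                   # widest bottleneck to the surface
        for i, nb in enumerate(dt.neighbors[s]):
            w = min(width, clearance(s, i))
            if w > best.get(nb, -1.0):
                best[nb] = w
                heapq.heappush(heap, (-w, nb))
    return None

# A small synthetic "molecule": a noisy spherical shell with a hollow centre.
rng = np.random.default_rng(0)
pts = rng.normal(size=(300, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
pts += rng.normal(scale=0.05, size=pts.shape)
print(widest_tunnel(pts, (0.0, 0.0, 0.0)))
```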