
Showing papers on "Delaunay triangulation published in 2011"


Book ChapterDOI
21 Sep 2011
TL;DR: It is shown that any tree can be realized as the Delaunay graph of its embedded vertices, which implies useful properties such as guaranteed greedy routing and realization as minimum spanning trees.
Abstract: This paper considers the problem of embedding trees into the hyperbolic plane. We show that any tree can be realized as the Delaunay graph of its embedded vertices. In particular, a weighted tree can be embedded such that the weight on each edge is realized as the hyperbolic distance between its embedded vertices. Thus the embedding preserves the metric information of the tree along with its topology. The distance distortion between non-adjacent vertices can be made arbitrarily small: less than a (1+ε) factor for any given ε > 0. Existing results on low-distortion embedding of discrete metrics into trees carry over to the hyperbolic metric through this result. The Delaunay character implies useful properties such as guaranteed greedy routing and realization as minimum spanning trees.
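The abstract's claim that edge weights are realized as hyperbolic distances can be checked numerically: in the Poincaré-disk model of the hyperbolic plane the distance has a simple closed form. A minimal sketch (the function name is ours, not from the paper):

```python
import math

def poincare_distance(u, v):
    # Hyperbolic distance between two points of the open unit disk
    # (Poincare-disk model of the hyperbolic plane).
    diff2 = (u[0] - v[0]) ** 2 + (u[1] - v[1]) ** 2
    den = (1 - u[0] ** 2 - u[1] ** 2) * (1 - v[0] ** 2 - v[1] ** 2)
    return math.acosh(1 + 2 * diff2 / den)

# Sanity check: the distance from the origin to (r, 0) is
# log((1 + r) / (1 - r)); for r = 0.5 this gives log(3).
```

An embedding of a weighted tree would place each vertex so that every tree edge's weight equals this distance between its endpoints.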

144 citations


Proceedings ArticleDOI
18 Apr 2011
TL;DR: A scalable logo recognition approach that extends the common bag-of-words model and incorporates local geometry in the indexing process and represents triangles by signatures capturing both visual appearance and local geometry.
Abstract: We propose a scalable logo recognition approach that extends the common bag-of-words model and incorporates local geometry in the indexing process. Given a query image and a large logo database, the goal is to recognize the logo contained in the query, if any. We locally group features in triples using multi-scale Delaunay triangulation and represent triangles by signatures capturing both visual appearance and local geometry. Each class is represented by the union of such signatures over all instances in the class. We see large scale recognition as a sub-linear search problem where signatures of the query image are looked up in an inverted index structure of the class models. We evaluate our approach on a large-scale logo recognition dataset with more than four thousand classes.
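The triple-grouping step can be sketched with a single-scale Delaunay triangulation. The paper uses multi-scale triangulations and quantized signatures in an inverted index; here we only show a scale-invariant geometric signature per triangle, with names of our own choosing:

```python
import numpy as np
from scipy.spatial import Delaunay

def triangle_signatures(points):
    # One scale-invariant signature per Delaunay triangle:
    # the three side lengths, sorted and normalized by the perimeter.
    pts = np.asarray(points, dtype=float)
    tri = Delaunay(pts)
    sigs = []
    for a, b, c in tri.simplices:
        sides = sorted([
            np.linalg.norm(pts[a] - pts[b]),
            np.linalg.norm(pts[b] - pts[c]),
            np.linalg.norm(pts[c] - pts[a]),
        ])
        perim = sum(sides)
        sigs.append(tuple(s / perim for s in sides))
    return sorted(sigs)  # order-independent multiset of signatures
```

In an indexing pipeline these signatures would additionally encode visual appearance and be quantized into keys of an inverted index over the class models.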

134 citations


Book
08 Sep 2011
TL;DR: In this paper, the authors prove central limit theorems and establish rates of convergence for several problems in geometrical probability when points are generated in the $\lbrack 0,1\rbrack^2$ square according to a Poisson point process with parameter $n$: the length of the nearest-neighbors graph, the length of the Delaunay triangulation, and the length of the Voronoi diagram.
Abstract: We prove central limit theorems and establish rates of convergence for the following problems in geometrical probability when points are generated in the $\lbrack 0,1\rbrack^2$ cube according to a Poisson point process with parameter $n$: 1. The length of the $k$-nearest-neighbors graph $N_{k,n}$, in which each point is connected to its $k$th nearest neighbor. 2. The length of the Delaunay triangulation $\operatorname{Del}_n$ of the points. 3. The length of the Voronoi diagram $\operatorname{Vor}_n$ of the points. Using the technique of dependency graphs of Baldi and Rinott, we show that the dependence range in all these problems converges quickly to 0 with high probability. Our approach also establishes rates of convergence for the number of points in the convex hull and the area outside the convex hull for points generated according to a Poisson point process in a circle.

102 citations


Journal ArticleDOI
TL;DR: A robust 2D shape reconstruction and simplification algorithm is proposed which takes as input a defect-laden point set with noise and outliers and constructs the resulting simplicial complex through greedy decimation of a Delaunay triangulation of the input point set.
Abstract: We propose a robust 2D shape reconstruction and simplification algorithm which takes as input a defect-laden point set with noise and outliers. We introduce an optimal-transport driven approach where the input point set, considered as a sum of Dirac measures, is approximated by a simplicial complex considered as a sum of uniform measures on 0- and 1-simplices. A fine-to-coarse scheme is devised to construct the resulting simplicial complex through greedy decimation of a Delaunay triangulation of the input point set. Our method performs well on a variety of examples ranging from line drawings to grayscale images, with or without noise, features, and boundaries.

101 citations


Journal ArticleDOI
TL;DR: An adaptive spatial clustering algorithm based on Delaunay triangulation (ASCDT for short) that can automatically discover clusters of complicated shapes, and non-homogeneous densities in a spatial database, without the need to set parameters or prior knowledge is proposed.

95 citations


Journal ArticleDOI
TL;DR: In this article, the failure of concrete from a mesoscopic point of view was studied using the Delaunay triangulation technique and the effects of mesostructural features such as aggregate grading curve, aggregate volumetric share, and more importantly the controlling parameters of mortar's damage-plasticity constitutive model have been investigated.

84 citations


Journal ArticleDOI
TL;DR: Numerical experiments indicate that the methods developed can produce well-shaped triangulations in a robust and efficient way, and lead to a new global mesh optimization scheme.

78 citations


Book ChapterDOI
01 Jan 2011
TL;DR: The use of Voronoi tessellations for grid generation, especially on the whole sphere or in regions on the sphere, is reviewed, and the computational solution of model equations based on CVTs on the sphere is discussed.
Abstract: We review the use of Voronoi tessellations for grid generation, especially on the whole sphere or in regions on the sphere. Voronoi tessellations and the corresponding Delaunay tessellations in regions and surfaces on Euclidean space are defined and properties they possess that make them well-suited for grid generation purposes are discussed, as are algorithms for their construction. This is followed by a more detailed look at one very special type of Voronoi tessellation, the centroidal Voronoi tessellation (CVT). After defining them, discussing some of their properties, and presenting algorithms for their construction, we illustrate the use of CVTs for producing both quasi-uniform and variable resolution meshes in the plane and on the sphere. Finally, we briefly discuss the computational solution of model equations based on CVTs on the sphere.
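The standard construction algorithm for the CVTs surveyed here is Lloyd's method: repeatedly assign the domain to the nearest site and move each site to the centroid of its region. A discrete sketch over the unit square, approximating the regions by a fixed sample grid (function name and parameters are illustrative, not from the chapter):

```python
import numpy as np

def lloyd_cvt(sites, samples, iters=25):
    # Lloyd iteration on a discrete approximation of the domain:
    # each sample point is assigned to its nearest site, and every
    # site then moves to the centroid (mean) of its assigned samples.
    sites = np.asarray(sites, dtype=float).copy()
    energies = []
    for _ in range(iters):
        d2 = ((samples[:, None, :] - sites[None, :, :]) ** 2).sum(axis=2)
        nearest = d2.argmin(axis=1)
        energies.append(float(d2[np.arange(len(samples)), nearest].sum()))
        for k in range(len(sites)):
            mask = nearest == k
            if mask.any():          # a site with no samples stays put
                sites[k] = samples[mask].mean(axis=0)
    return sites, energies

# A dense grid of samples stands in for the continuous unit square.
grid = np.linspace(0.0, 1.0, 40)
samples = np.array([(x, y) for x in grid for y in grid])
rng = np.random.default_rng(0)
sites, energies = lloyd_cvt(rng.random((8, 2)), samples)
```

The recorded quantization energy is non-increasing, which is the usual convergence diagnostic for Lloyd iterations.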

74 citations


Proceedings ArticleDOI
25 Jul 2011
TL;DR: Hodge-optimized triangulations (HOT) as discussed by the authors are a family of well-shaped primal-dual pairs of complexes designed for fast and accurate computations in computer graphics.
Abstract: We introduce Hodge-optimized triangulations (HOT), a family of well-shaped primal-dual pairs of complexes designed for fast and accurate computations in computer graphics. Previous work most commonly employs barycentric or circumcentric duals; while barycentric duals guarantee that the dual of each simplex lies within the simplex, circumcentric duals are often preferred due to the induced orthogonality between primal and dual complexes. We instead promote the use of weighted duals ("power diagrams"). They allow greater flexibility in the location of dual vertices while keeping primal-dual orthogonality, thus providing a valuable extension to the usual choices of dual by only adding one additional scalar per primal vertex. Furthermore, we introduce a family of functionals on pairs of complexes that we derive from bounds on the errors induced by diagonal Hodge stars, commonly used in discrete computations. The minimizers of these functionals, called HOT meshes, are shown to be generalizations of Centroidal Voronoi Tesselations and Optimal Delaunay Triangulations, and to provide increased accuracy and flexibility for a variety of computational purposes.

69 citations


Journal ArticleDOI
TL;DR: This paper proposes a practical algorithm based on the construction of a constrained Delaunay tetrahedralization for a set of constraints (segments and facets) that adds additional points (so‐called Steiner points) on segments only.

56 citations


Journal ArticleDOI
TL;DR: It is proved that the stretch factor of the Delaunay triangulation of a set of points in the Euclidean plane is less than 1.998, improving the previous best upper bound of 2.42 by Keil and Gutwin.
Abstract: Let $S$ be a finite set of points in the Euclidean plane. Let $D$ be a Delaunay triangulation of $S$. The {\em stretch factor} (also known as {\em dilation} or {\em spanning ratio}) of $D$ is the maximum ratio, among all points $p$ and $q$ in $S$, of the shortest path distance from $p$ to $q$ in $D$ over the Euclidean distance $||pq||$. Proving a tight bound on the stretch factor of the Delaunay triangulation has been a long-standing open problem in computational geometry. In this paper we prove that the stretch factor of the Delaunay triangulation of a set of points in the plane is less than $\rho = 1.998$, improving the previous best upper bound of 2.42 by Keil and Gutwin (1989). Our bound 1.998 is better than the current upper bound of 2.33 for the special case when the point set is in convex position by Cui, Kanj and Xia (2009). This upper bound breaks the barrier 2, which is significant because previously no family of plane graphs was known to have a stretch factor guaranteed to be less than 2 on any set of points.
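The stretch factor defined in the abstract can be computed directly for small point sets: build the Delaunay edge graph, run all-pairs shortest paths, and take the maximum ratio over pairs. A sketch of the definition made executable (not the paper's proof technique):

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

def delaunay_stretch_factor(points):
    # Maximum, over all pairs p, q, of (shortest-path distance in the
    # Delaunay graph) / (Euclidean distance ||pq||).
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    edges = set()
    for simplex in Delaunay(pts).simplices:
        for i in range(3):
            a, b = simplex[i], simplex[(i + 1) % 3]
            edges.add((min(a, b), max(a, b)))
    row, col, data = [], [], []
    for a, b in edges:
        w = np.linalg.norm(pts[a] - pts[b])
        row += [a, b]; col += [b, a]; data += [w, w]
    graph = csr_matrix((data, (row, col)), shape=(n, n))
    sp = dijkstra(graph, directed=False)
    eu = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    off_diag = ~np.eye(n, dtype=bool)
    return float(np.max(sp[off_diag] / eu[off_diag]))
```

For a single triangle every pair is adjacent, so the stretch factor is exactly 1; by this paper's theorem, any planar point set gives a value below 1.998.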

Journal ArticleDOI
TL;DR: In this article, the authors show how to construct several geometric structures efficiently in the constant-work-space model, including the Voronoi diagram and the Euclidean minimum spanning tree (EMST).
Abstract: Constant-work-space algorithms may use only constantly many cells of storage in addition to their input, which is provided as a read-only array. We show how to construct several geometric structures efficiently in the constant-work-space model. Traditional algorithms process the input into a suitable data structure (like a doubly-connected edge list) that allows efficient traversal of the structure at hand. In the constant-work-space setting, however, we cannot afford to do this. Instead, we provide operations that compute the desired features on the fly by accessing the input with no extra space. The whole geometric structure can be obtained by using these operations to enumerate all the features. Of course, we must pay for the space savings by slower running times. While the standard data structure allows us to implement traversal operations in constant time, our schemes typically take linear time to read the input data in each step. We begin with two simple problems: triangulating a planar point set and finding the trapezoidal decomposition of a simple polygon. In both cases adjacent features can be enumerated in linear time per step, resulting in total quadratic running time to output the whole structure. Actually, we show that the former result carries over to the Delaunay triangulation, and hence the Voronoi diagram. This also means that we can compute the largest empty circle of a planar point set in quadratic time and constant work-space. As another application, we demonstrate how to enumerate the features of a Euclidean minimum spanning tree (EMST) in quadratic time per step, so that the whole EMST can be found in cubic time using constant work-space. Finally, we describe how to compute a shortest geodesic path between two points in a simple polygon. Although the shortest path problem in general graphs is NL-complete (Jakoby and Tantau 2003), this constrained problem can be solved in quadratic time using only constant work-space.

Journal ArticleDOI
TL;DR: A new methodology for the solution of the 2D diffusive shallow water equations over Delaunay unstructured triangular meshes is presented and several numerical experiments have been carried out to test the stability of the proposed model with regard to the size of the Courant number and to the mesh irregularity, and its computational performance.

Proceedings ArticleDOI
Ge Xia1
13 Jun 2011
TL;DR: This paper proves that the stretch factor of the Delaunay triangulation of a set of points in the plane is less than ρ = 1.998, improving the previous best upper bound of 2.42 by Keil and Gutwin (1989).
Abstract: Let S be a finite set of points in the Euclidean plane. Let D be a Delaunay triangulation of S. The stretch factor (also known as dilation or spanning ratio) of D is the maximum ratio, among all points p and q in S, of the shortest path distance from p to q in D over the Euclidean distance ||pq||. Proving a tight bound on the stretch factor of the Delaunay triangulation has been a long-standing open problem in computational geometry. In this paper we prove that the stretch factor of the Delaunay triangulation of a set of points in the plane is less than ρ = 1.998, improving the previous best upper bound of 2.42 by Keil and Gutwin (1989).

Journal ArticleDOI
TL;DR: An original method for cluster selection in Atom Probe Tomography, designed for large datasets, based on the calculation of the Delaunay tessellation generated by the distribution of atoms of a selected element; it requires a single input parameter from the user.

Journal ArticleDOI
TL;DR: A robust parallel Delaunay triangulation algorithm for processing billions of points from nonoverlapped block LiDAR files called ParaStream, which targets ubiquitous multicore architectures and exploits most of the computing power of multicore platforms through parallel computing.

Journal ArticleDOI
TL;DR: The vertex set of the Capacity-Constrained Delaunay Triangulation (CCDT) is shown to have good blue noise characteristics, comparable in quality to those of state-of-the-art methods, achieved at a fraction of the runtime.

Journal ArticleDOI
15 Mar 2011-Sensors
TL;DR: This paper proposes a new coverage measurement method using Delaunay Triangulation (DT), which can provide the value for all coverage measurement tools and categorizes sensors as ‘fat’, ‘healthy’ or ‘thin’ to show the dense, optimal and scattered areas.
Abstract: Sensing and communication coverage are among the most important trade-offs in Wireless Sensor Network (WSN) design. A minimum bound of sensing coverage is vital in scheduling, target tracking and redeployment phases, as well as providing communication coverage. Some methods measure the coverage as a percentage value, but detailed information has been missing. Two scenarios with equal coverage percentage may not have the same Quality of Coverage (QoC). In this paper, we propose a new coverage measurement method using Delaunay Triangulation (DT). This can provide the value for all coverage measurement tools. Moreover, it categorizes sensors as 'fat', 'healthy' or 'thin' to show the dense, optimal and scattered areas. It can also yield the largest empty area of sensors in the field. Simulation results show that the proposed DT method can achieve accurate coverage information, and provides many tools to compare QoC between different scenarios.
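One of the tools the abstract mentions, the largest empty area, can be estimated from the triangulation itself: a Delaunay triangle's circumcircle contains no input point, so the largest circumradius measures the biggest sensor-free disk. A sketch under that simplification (we ignore circumcenters that fall outside the deployment region; the function name is ours):

```python
import numpy as np
from scipy.spatial import Delaunay

def largest_empty_circumradius(points):
    # The circumcircle of each Delaunay triangle is empty of sensors,
    # so the maximum circumradius R = abc / (4 * area) bounds the
    # largest sensing hole (up to boundary effects).
    pts = np.asarray(points, dtype=float)
    best = 0.0
    for i, j, k in Delaunay(pts).simplices:
        a = np.linalg.norm(pts[j] - pts[k])
        b = np.linalg.norm(pts[k] - pts[i])
        c = np.linalg.norm(pts[i] - pts[j])
        u, v = pts[j] - pts[i], pts[k] - pts[i]
        area = 0.5 * abs(u[0] * v[1] - u[1] * v[0])
        if area > 1e-12:  # skip degenerate (collinear) triangles
            best = max(best, a * b * c / (4.0 * area))
    return best
```

For four sensors at the unit-square corners plus one at the center, every triangle's circumradius is 0.5, so the largest empty disk has radius 0.5.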

Journal ArticleDOI
TL;DR: The performance of the proposed method is compared with that of the Cocone family and that of Ball Pivoting as regards the tessellation rate and the quality of the surface being generated from some benchmark point clouds and artificially noised test cases.
Abstract: This paper presents a new high-performance method for triangular mesh generation based on a mesh-growing approach. Starting from a seed triangle, the algorithm grows the triangular mesh by selecting a new point based on the Gabriel 2-Simplex criterion. This criterion can be considered to be a good approximation of the 2D Delaunay if the point cloud is well-sampled and not too rough. The performance of the proposed method is compared with that of the Cocone family and that of Ball Pivoting as regards the tessellation rate and the quality of the surface being generated from some benchmark point clouds and artificially noised test cases. The results are analysed and critically discussed.

Journal ArticleDOI
TL;DR: This paper shows how to construct point sets in convex position with stretch factor > 1.5810 and in general position with stretch factor > 1.5846, and shows that a sufficiently large set of points drawn independently from any distribution will in the limit approach the worst-case stretch factor for that distribution.
Abstract: Consider the Delaunay triangulation T of a set P of points in the plane as a Euclidean graph, in which the weight of every edge is its length. It has long been conjectured that the stretch factor in T of any pair p, p' ∈ P, which is the ratio of the length of the shortest path from p to p' in T over the Euclidean distance ||pp'||, can be at most π/2 ≈ 1.5708. In this paper, we show how to construct point sets in convex position with stretch factor > 1.5810 and in general position with stretch factor > 1.5846. Furthermore, we show that a sufficiently large set of points drawn independently from any distribution will in the limit approach the worst-case stretch factor for that distribution.

Journal ArticleDOI
TL;DR: A Conforming Delaunay Triangulation (CDT) algorithm based on maximal Poisson disk sampling that works well in practice, has the blue-noise property, and is fast and uses little memory.
Abstract: We present a Conforming Delaunay Triangulation (CDT) algorithm based on maximal Poisson disk sampling. Points are unbiased, meaning the probability of introducing a vertex in a disk-free subregion is proportional to its area, except in a neighborhood of the domain boundary. In contrast, Delaunay refinement CDT algorithms place points dependent on the geometry of empty circles in intermediate triangulations, usually near the circle centers. Unconstrained angles in our mesh are between 30° and 120°, matching some biased CDT methods. Points are placed on the boundary using a one-dimensional maximal Poisson disk sampling. Any triangulation method producing angles bounded away from 0° and 180° must have some bias near the domain boundary to avoid placing vertices infinitesimally close to the boundary. Random meshes are preferred for some simulations, such as fracture simulations where cracks must follow mesh edges, because deterministic meshes may introduce non-physical phenomena. An ensemble of random meshes aids simulation validation. Poisson-disk triangulations also avoid some graphics rendering artifacts, and have the blue-noise property. We mesh two-dimensional domains that may be non-convex with holes, required points, and multiple regions in contact. Our algorithm is also fast and uses little memory. We have recently developed a method for generating a maximal Poisson distribution of n output points, where n = Θ(Area/r²) and r is the sampling radius. It takes O(n) memory and O(n log n) expected time; in practice the time is nearly linear. This, or a similar subroutine, generates our random points. Except for this subroutine, we provably use O(n) time and space. The subroutine gives the location of points in a square background mesh. Given this, the neighborhood of each point can be meshed independently in constant time. These features facilitate parallel and GPU implementations.
Our implementation works well in practice as illustrated by several examples and comparison to Triangle. Highlights: a conforming Delaunay triangulation algorithm based on maximal Poisson-disk sampling; angles between 30° and 120°; two-dimensional non-convex domains with holes and planar straight-line graphs; O(n) space and expected O(n log n) time, efficient in practice; background squares ensure all computations are local.
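The point process driving this mesher can be sketched with plain dart throwing: accept a uniform sample iff it lies at least r from every accepted point, and stop after many consecutive rejections. The paper's subroutine reaches provable maximality in expected O(n log n) time using a background grid; the loop below is only the naive O(n)-per-dart version (names and parameters are ours):

```python
import random

def dart_throw_poisson_disk(r, width=1.0, height=1.0,
                            max_misses=2000, seed=1):
    # Naive maximal-ish Poisson-disk sampling by rejection: keep
    # throwing uniform darts, accepting one iff it is >= r from all
    # accepted points; stop after max_misses consecutive rejections.
    rng = random.Random(seed)
    pts = []
    misses = 0
    while misses < max_misses:
        p = (rng.uniform(0.0, width), rng.uniform(0.0, height))
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= r * r
               for q in pts):
            pts.append(p)
            misses = 0
        else:
            misses += 1
    return pts
```

By construction every pair of accepted points is separated by at least r, which is the disk-free guarantee the CDT algorithm relies on.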

Journal ArticleDOI
TL;DR: A reduction from DTs to nearest-neighbor graphs is described that relies on a new variant of randomized incremental constructions using dependent sampling; the results generalize to higher dimensions.
Abstract: We present several results about Delaunay triangulations (DTs) and convex hulls in transdichotomous and hereditary settings: (i) the DT of a planar point set can be computed in expected time O(sort(n)) on a word RAM, where sort(n) is the time to sort n numbers. We assume that the word RAM supports the shuffle operation in constant time; (ii) if we know the ordering of a planar point set in x- and in y-direction, its DT can be found by a randomized algebraic computation tree of expected linear depth; (iii) given a universe U of points in the plane, we construct a data structure D for Delaunay queries: for any P ⊆ U, D can find the DT of P in expected time O(|P| log log |U|); (iv) given a universe U of points in 3-space in general convex position, there is a data structure D for convex hull queries: for any P ⊆ U, D can find the convex hull of P in expected time O(|P| (log log |U|)²); (v) given a convex polytope in 3-space with n vertices which are colored with χ ≥ 2 colors, we can split it into the convex hulls of the individual color classes in expected time O(n (log log n)²). The results (i)-(iii) generalize to higher dimensions, where the expected running time now also depends on the complexity of the resulting DT. We need a wide range of techniques. Most prominently, we describe a reduction from DTs to nearest-neighbor graphs that relies on a new variant of randomized incremental constructions using dependent sampling.

Journal ArticleDOI
TL;DR: This work shows how to leverage the knowledge of ℛ for faster Delaunay computation and optimally handles a wide variety of inputs, e.g., overlapping disks of different sizes and fat regions.
Abstract: Suppose we want to compute the Delaunay triangulation of a set P whose points are restricted to a collection ℛ of input regions known in advance. Building on recent work by Löffler and Snoeyink, we show how to leverage our knowledge of ℛ for faster Delaunay computation. Our approach needs no fancy machinery and optimally handles a wide variety of inputs, e.g., overlapping disks of different sizes and fat regions.

Journal ArticleDOI
TL;DR: Patient-specific abdominal aortic aneurysms (AAAs) are characterized by local curvature changes, which are assessed using a feature-based approach on topologies representative of the AAA outer wall surface using a Delaunay triangulation algorithm adapted for AAA segmented masks.
Abstract: Patient-specific abdominal aortic aneurysms (AAAs) are characterized by local curvature changes, which we assess using a feature-based approach on topologies representative of the AAA outer wall surface. The application of image segmentation methods yields 3D reconstructed surface polygons that contain low-quality elements, unrealistic sharp corners, and surface irregularities. To optimize the quality of the surface topology, an iterative algorithm was developed to perform interpolation of the AAA geometry, topology refinement, and smoothing. Triangular surface topologies are generated based on a Delaunay triangulation algorithm, which is adapted for AAA segmented masks. The boundary of the AAA wall is represented using a signed distance function prior to triangulation. The irregularities on the surface are minimized by an interpolation scheme and the initial coarse triangulation is refined by forcing nodes into equilibrium positions. A surface smoothing algorithm based on a low-pass filter is applied to remove sharp corners. The optimal number of iterations needed for polygon refinement and smoothing is determined by imposing a minimum average element quality index with no significant AAA sac volume change. This framework automatically generates high-quality triangular surface topologies that can be used to characterize local curvature changes of the AAA wall.

Proceedings Article
Ge Xia1, Liang Zhang1
01 Jan 2011
TL;DR: This paper proposes an improved lower bound of 1.5932 for the stretch factor of the Delaunay triangulation, derived from a sequence of chains sharing a set of properties; the authors conjecture that these properties are also shared by a chain with the worst stretch factor.
Abstract: In this paper, we investigate the tight bound of the stretch factor of the Delaunay triangulation by studying the stretch factor of the chain (Xia 2011). We define a sequence Γ = (Γ1, Γ2, Γ3, ...) where Γi is the maximum stretch factor of a chain of i circles, and show that Γ is strictly increasing. We then present an improved lower bound of 1.5932 for the stretch factor of the Delaunay triangulation. This bound is derived from a sequence of chains sharing a set of properties. We conjecture that these properties are also shared by a chain with the worst stretch factor.

Book ChapterDOI
01 Jan 2011
TL;DR: A survey of the existing solutions for geosensor network optimization that use the Voronoi diagram and Delaunay triangulation is presented, their limitations in real-world applications are identified, and a more realistic approach is proposed that integrates spatial information into a Voronoi-diagram-based optimization process.
Abstract: Recent advances in sensor technology have resulted in the design and development of more efficient and low-cost sensor networks for environmental monitoring, object surveillance, tracking and controlling of moving objects, etc. The deployment of a sensor network in a real environment presents several challenging issues that are often oversimplified in the existing solutions. Different approaches have been proposed in the literature to solve this problem. Many of these approaches use the Voronoi diagram and Delaunay triangulation to identify sensing holes in the network and create an optimal arrangement of the sensors to eliminate the holes. However, most of these methods do not consider the reality of the environment in which the sensor network is deployed. This paper presents a survey of the existing solutions for geosensor network optimization that use the Voronoi diagram and Delaunay triangulation and identifies their limitations in a real-world application. Next, it proposes a more realistic approach by integrating spatial information in the optimization process based on the Voronoi diagram. Finally, the results of two case studies based on the proposed approach, in a natural area and an urban environment, are presented and discussed.

Journal ArticleDOI
TL;DR: Two new types of neighborhood graphs are presented: a variation on and a generalization of empty region graphs, which considerably improve the robustness of neighborhood-based analysis tools, such as topological decomposition.
Abstract: Sparse, irregular sampling is becoming a necessity for reconstructing large and high-dimensional signals. However, the analysis of this type of data remains a challenge. One issue is the robust selection of neighborhoods - a crucial part of analytic tools such as topological decomposition, clustering and gradient estimation. When extracting the topology of sparsely sampled data, common neighborhood strategies such as k-nearest neighbors may lead to inaccurate results, either due to missing neighborhood connections, which introduce false extrema, or due to spurious connections, which conceal true extrema. Other neighborhoods, such as the Delaunay triangulation, are costly to compute and store even in relatively low dimensions. In this paper, we address these issues. We present two new types of neighborhood graphs: a variation on and a generalization of empty region graphs, which considerably improve the robustness of neighborhood-based analysis tools, such as topological decomposition. Our findings suggest that these neighborhood graphs lead to more accurate topological representations of low- and high- dimensional data sets at relatively low cost, both in terms of storage and computation time. We describe the implications of our work in the analysis and visualization of scalar functions, and provide general strategies for computing and applying our neighborhood graphs towards robust data analysis.
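A concrete instance of the empty region graphs generalized here is the Gabriel graph: points p and q are joined iff the disk whose diameter is pq contains no other input point. A brute-force O(n³) sketch for illustration (the paper's variants differ; this only shows the empty-region test):

```python
def gabriel_edges(points):
    # Edge (i, j) is kept iff the open disk with diameter
    # points[i]-points[j] contains no third input point.
    eps = 1e-12
    edges = []
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            cx = (points[i][0] + points[j][0]) / 2.0
            cy = (points[i][1] + points[j][1]) / 2.0
            r2 = (points[i][0] - cx) ** 2 + (points[i][1] - cy) ** 2
            if all((points[k][0] - cx) ** 2 + (points[k][1] - cy) ** 2
                   >= r2 - eps
                   for k in range(n) if k != i and k != j):
                edges.append((i, j))
    return edges
```

The Gabriel graph is a subgraph of the Delaunay triangulation, which is one reason empty region graphs can serve as cheaper neighborhood structures than the full triangulation.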

Journal ArticleDOI
TL;DR: An approach to constructing a Pareto front approximation to computationally expensive multiobjective optimization problems is developed; the approximation is constructed as a sub-complex of a Delaunay triangulation of a finite set of Pareto optimal outcomes to the problem.
Abstract: An approach to constructing a Pareto front approximation to computationally expensive multiobjective optimization problems is developed. The approximation is constructed as a sub-complex of a Delaunay triangulation of a finite set of Pareto optimal outcomes to the problem. The approach is based on the concept of inherent nondominance. Rules for checking the inherent nondominance of complexes are developed and applying the rules is demonstrated with examples. The quality of the approximation is quantified with error estimates. Due to its properties, the Pareto front approximation works as a surrogate to the original problem for decision making with interactive methods.

Journal ArticleDOI
TL;DR: In this paper, the Natural Neighbour Radial Point Interpolation Method (NNRPIM) is used in the numerical implementation of an Unconstrained Third-Order Plate Theory applied to laminates.

Journal ArticleDOI
TL;DR: In this article, the authors investigate the ability of three reconstruction techniques to analyze and investigate weblike features and geometries in a discrete distribution of objects: the linear Delaunay Tessellation Field Estimator (DTFE), its higher order equivalent Natural Neighbour Field Estimator (NNFE), and a version of Kriging interpolation adapted to the specific circumstances encountered in galaxy redshift surveys, the Natural Lognormal Kriging technique.
Abstract: We investigate the ability of three reconstruction techniques to analyze and investigate weblike features and geometries in a discrete distribution of objects. The three methods are the linear Delaunay Tessellation Field Estimator (DTFE), its higher order equivalent Natural Neighbour Field Estimator (NNFE) and a version of Kriging interpolation adapted to the specific circumstances encountered in galaxy redshift surveys, the Natural Lognormal Kriging technique. DTFE and NNFE are based on the local geometry defined by the Voronoi and Delaunay tessellations of the galaxy distribution. The three reconstruction methods are analysed and compared using mock magnitude-limited and volume-limited SDSS redshift surveys, obtained on the basis of the Millennium simulation. We investigate error trends, biases and the topological structure of the resulting fields, concentrating on the void population identified by the Watershed Void Finder. Environmental effects are addressed by evaluating the density fields on a range of Gaussian filter scales. Comparison with the void population in the original simulation yields the fraction of false void mergers and false void splits. In most tests DTFE, NNFE and Kriging have largely similar density and topology error behaviour. Cosmetically, higher order NNFE and Kriging methods produce more visually appealing reconstructions. Quantitatively, however, DTFE performs better, even while computationally far less demanding. A successful recovery of the void population on small scales appears to be difficult, while the void recovery rate improves significantly on scales > 3 h^-1 Mpc. A study of small scale voids and the void galaxy population should therefore be restricted to the local Universe, out to at most 100 h^-1 Mpc.