Journal ArticleDOI

New applications of random sampling in computational geometry

Kenneth L. Clarkson
01 Jun 1987 - Discrete and Computational Geometry (Springer New York) - Vol. 2, Iss. 1, pp. 195-222
TL;DR: This paper gives several new demonstrations of the usefulness of random sampling techniques in computational geometry, including a search structure for arrangements of hyperplanes that is built by sampling the hyperplanes and using information from the resulting arrangement to divide and conquer.
Abstract: This paper gives several new demonstrations of the usefulness of random sampling techniques in computational geometry. One new algorithm creates a search structure for arrangements of hyperplanes by sampling the hyperplanes and using information from the resulting arrangement to divide and conquer. This algorithm requires O(s^(d+ε)) expected preprocessing time to build a search structure for an arrangement of s hyperplanes in d dimensions. The expectation, as with all expected times reported here, is with respect to the random behavior of the algorithm, and holds for any input. Given the data structure, and a query point p, the cell of the arrangement containing p can be found in O(log s) worst-case time. (The bound holds for any fixed ε > 0, with the constant factors dependent on d and ε.) Using point-plane duality, the algorithm may be used for answering halfspace range queries. Another algorithm finds random samples of simplices to determine the separation distance of two polytopes. The algorithm uses expected O(n^⌊d/2⌋) time, where n is the total number of vertices of the two polytopes. This matches previous results [10] for the case d = 3 and extends them. Another algorithm samples points in the plane to determine their order-k Voronoi diagram, and requires expected O(s^(1+ε) k) time for s points. (It is assumed that no four of the points are cocircular.) This sharpens the bound O(s k^2 log s) for Lee's algorithm [21], and O(s^2 log s + k(s−k) log^2 s) for Chazelle and Edelsbrunner's algorithm [4]. Finally, random sampling is used to show that any set of s points in E^3 has O(s k^2 log^8 s/(log log s)^6) distinct j-sets with j ≤ k. (For S ⊂ E^d, a set S′ ⊂ S with |S′| = j is a j-set of S if there is a half-space h^+ with S′ = S ∩ h^+.) This sharpens with respect to k the previous bound O(s k^5) [5]. The proof of the bound given here is an instance of a "probabilistic method" [15].
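
The duality step mentioned in the abstract can be made concrete in two dimensions. Under the standard point-line duality, a point (a, b) maps to the line y = ax - b and a line y = mx + c maps to the point (m, -c); a point then lies below a line exactly when the line's dual point lies below the point's dual line, so a halfplane range query in the primal becomes a query about the dual arrangement. The sketch below is only an illustrative brute-force check of that property, not the paper's search structure.

```python
import random

def dual_of_point(p):
    """Point (a, b) -> dual line y = a*x - b, stored as (slope, intercept)."""
    a, b = p
    return (a, -b)

def dual_of_line(line):
    """Line y = m*x + c -> dual point (m, -c)."""
    m, c = line
    return (m, -c)

def below(point, line):
    """True if `point` lies strictly below the line y = m*x + c."""
    x, y = point
    m, c = line
    return y < m * x + c

random.seed(1)
points = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(200)]
query_line = (0.7, 1.3)                 # halfplane query: y < 0.7*x + 1.3
q_star = dual_of_line(query_line)

# Primal answer: points inside the open halfplane below the query line.
primal = {p for p in points if below(p, query_line)}

# Dual answer: the same points, recognized because the dual query point
# lies below their dual lines.
dual = {p for p in points if below(q_star, dual_of_point(p))}

assert primal == dual
print(f"{len(primal)} of {len(points)} points lie in the query halfplane")
```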


Citations
Journal ArticleDOI
TL;DR: The Voronoi diagram, as discussed by the authors, divides the plane according to the nearest-neighbor rule: each site is assigned the region of the plane closer to it than to any other site.
Abstract: Computational geometry is concerned with the design and analysis of algorithms for geometrical problems. In addition, other, more practically oriented, areas of computer science, such as computer graphics, computer-aided design, robotics, pattern recognition, and operations research, give rise to problems that inherently are geometrical. This is one reason computational geometry has attracted enormous research interest in the past decade and is a well-established area today. (For standard sources, we refer to the survey article by Lee and Preparata [1984] and to the textbooks by Preparata and Shamos [1985] and Edelsbrunner [1987b].) Readers familiar with the literature of computational geometry will have noticed, especially in the last few years, an increasing interest in a geometrical construct called the Voronoi diagram. This trend can also be observed in combinatorial geometry and in a considerable number of articles in natural science journals that address the Voronoi diagram under different names specific to the respective area. Given some number of points in the plane, their Voronoi diagram divides the plane according to the nearest-neighbor rule.
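
The nearest-neighbor rule that closes the abstract is simple to state directly: a query location belongs to the Voronoi cell of whichever site is closest to it. A minimal brute-force sketch of just that rule (an illustration, not a diagram construction algorithm):

```python
import math

def closest_site(q, sites):
    """Return the site whose Voronoi cell contains q (ties broken arbitrarily)."""
    return min(sites, key=lambda s: math.dist(q, s))

sites = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]
print(closest_site((1.0, 1.0), sites))   # -> (0.0, 0.0)
print(closest_site((3.5, 0.5), sites))   # -> (4.0, 0.0)
```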

4,236 citations

Proceedings ArticleDOI
Kenneth L. Clarkson
06 Jan 1988
TL;DR: Asymptotically tight bounds for a combinatorial quantity of interest in discrete and computational geometry, related to halfspace partitions of point sets, are given.
Abstract: Random sampling is used for several new geometric algorithms. The algorithms are “Las Vegas,” and their expected bounds are with respect to the random behavior of the algorithms. One algorithm reports all the intersecting pairs of a set of line segments in the plane, and requires O(A + n log n) expected time, where A is the size of the answer, the number of intersecting pairs reported. The algorithm requires O(n) space in the worst case. Another algorithm computes the convex hull of a point set in E^3 in O(n log A) expected time, where n is the number of points and A is the number of points on the surface of the hull. A simple Las Vegas algorithm triangulates simple polygons in O(n log log n) expected time. Algorithms for half-space range reporting are also given. In addition, this paper gives asymptotically tight bounds for a combinatorial quantity of interest in discrete and computational geometry, related to halfspace partitions of point sets.
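
In the segment-intersection result above, A is simply the number of intersecting pairs. The sketch below counts A by brute force with the standard orientation test; it is a quadratic reference baseline for checking answers, not the paper's randomized O(A + n log n) algorithm.

```python
from itertools import combinations

def orient(p, q, r):
    """Cross product (q - p) x (r - p): > 0 left turn, < 0 right turn, 0 collinear."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def on_segment(p, q, r):
    """Assuming p, q, r are collinear, is r on the segment pq?"""
    return (min(p[0], q[0]) <= r[0] <= max(p[0], q[0]) and
            min(p[1], q[1]) <= r[1] <= max(p[1], q[1]))

def segments_intersect(s, t):
    p1, p2 = s
    p3, p4 = t
    d1, d2 = orient(p3, p4, p1), orient(p3, p4, p2)
    d3, d4 = orient(p1, p2, p3), orient(p1, p2, p4)
    if ((d1 > 0 and d2 < 0) or (d1 < 0 and d2 > 0)) and \
       ((d3 > 0 and d4 < 0) or (d3 < 0 and d4 > 0)):
        return True
    # Degenerate cases: collinear or touching endpoints.
    if d1 == 0 and on_segment(p3, p4, p1): return True
    if d2 == 0 and on_segment(p3, p4, p2): return True
    if d3 == 0 and on_segment(p1, p2, p3): return True
    if d4 == 0 and on_segment(p1, p2, p4): return True
    return False

segments = [((0, 0), (4, 4)), ((0, 4), (4, 0)), ((5, 5), (6, 6))]
pairs = [(a, b) for a, b in combinations(segments, 2) if segments_intersect(a, b)]
print(len(pairs), "intersecting pair(s)")   # A = 1 here
```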

1,163 citations

Proceedings ArticleDOI
01 Jan 1993
TL;DR: The vp-tree (vantage point tree) is introduced in several forms, together with associated algorithms, as an improved method for these difficult search problems in general metric spaces.
Abstract: We consider the computational problem of finding nearest neighbors in general metric spaces. Of particular interest are spaces that may not be conveniently embedded or approximated in Euclidian space, or where the dimensionality of a Euclidian representation is very high. Also relevant are high-dimensional Euclidian settings in which the distribution of data is in some sense of lower dimension and embedded in the space. The vp-tree (vantage point tree) is introduced in several forms, together with associated algorithms, as an improved method for these difficult search problems. Tree construction executes in O(n log(n)) time, and search is, under certain circumstances and in the limit, O(log(n)) expected time. The theoretical basis for this approach is developed and the results of several experiments are reported. In Euclidian cases, kd-tree performance is compared.
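
A minimal vantage-point tree in the spirit of the abstract (an illustrative reconstruction, not the paper's exact procedure): pick a vantage point, split the remaining points at the median distance to it, and prune subtrees during search with the triangle inequality.

```python
import math
import random

class VPNode:
    def __init__(self, point, radius, inside, outside):
        self.point = point      # the vantage point
        self.radius = radius    # median distance from the vantage point
        self.inside = inside    # subtree with dist(vantage, p) <= radius
        self.outside = outside  # subtree with dist(vantage, p) > radius

def build(points, dist=math.dist):
    if not points:
        return None
    points = list(points)
    vp = points.pop(random.randrange(len(points)))
    if not points:
        return VPNode(vp, 0.0, None, None)
    dists = sorted(dist(vp, p) for p in points)
    radius = dists[len(dists) // 2]
    inside = [p for p in points if dist(vp, p) <= radius]
    outside = [p for p in points if dist(vp, p) > radius]
    return VPNode(vp, radius, build(inside, dist), build(outside, dist))

def nearest(node, q, dist=math.dist, best=None):
    """Return (best_distance, best_point), pruning with the triangle inequality."""
    if node is None:
        return best
    d = dist(q, node.point)
    if best is None or d < best[0]:
        best = (d, node.point)
    # Visit the side containing q first; visit the other side only if the
    # ball of radius best[0] around q can cross the splitting sphere.
    near, far = (node.inside, node.outside) if d <= node.radius else (node.outside, node.inside)
    best = nearest(near, q, dist, best)
    if abs(d - node.radius) <= best[0]:
        best = nearest(far, q, dist, best)
    return best

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(500)]
tree = build(pts)
q = (0.5, 0.5)
d, p = nearest(tree, q)
assert math.isclose(d, min(math.dist(q, x) for x in pts))
print("nearest neighbor:", p, "at distance", round(d, 4))
```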

1,145 citations


Cites methods from "New applications of random sampling..."

  • ...More recently, the Voronoi diagram [21] has provided a useful tool in low-dimensional Euclidian settings, and the overall field and outlook of Computational Geometry has yielded many interesting results such as those of [22, 23, 24, 25] and earlier [26]....


Proceedings ArticleDOI
01 Oct 1998
TL;DR: New packet classification schemes are presented that, with a worst-case and traffic-independent performance metric, can classify packets, by checking amongst a few thousand filtering rules, at rates of a million packets per second using range matches on more than 4 packet header fields.
Abstract: The ability to provide differentiated services to users with widely varying requirements is becoming increasingly important, and Internet Service Providers would like to provide these differentiated services using the same shared network infrastructure. The key mechanism, that enables differentiation in a connectionless network, is the packet classification function that parses the headers of the packets, and after determining their context, classifies them based on administrative policies or real-time reservation decisions. Packet classification, however, is a complex operation that can become the bottleneck in routers that try to support gigabit link capacities. Hence, many proposals for differentiated services only require classification at lower speed edge routers and also avoid classification based on multiple fields in the packet header even if it might be advantageous to service providers. In this paper, we present new packet classification schemes that, with a worst-case and traffic-independent performance metric, can classify packets, by checking amongst a few thousand filtering rules, at rates of a million packets per second using range matches on more than 4 packet header fields. For a special case of classification in two dimensions, we present an algorithm that can handle more than 128K rules at these speeds in a traffic independent manner. We emphasize worst-case performance over average case performance because providing differentiated services requires intelligent queueing and scheduling of packets that precludes any significant queueing before the differentiating step (i.e., before packet classification). The presented filtering or classification schemes can be used to classify packets for security policy enforcement, applying resource management decisions, flow identification for RSVP reservations, multicast look-ups, and for source-destination and policy based routing. The scalability and performance of the algorithms have been demonstrated by implementation and testing in a prototype system.
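
The classification semantics described above reduce to finding the highest-priority rule whose ranges all contain the packet's header fields. The sketch below expresses that semantics as a plain linear scan over hypothetical field names; the paper's contribution is achieving the same result with worst-case guarantees at far larger rule counts and line rates.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

Range = Tuple[int, int]   # inclusive [lo, hi]

@dataclass
class Rule:
    name: str
    fields: Dict[str, Range]   # ranges on header fields (addresses, ports, protocol, ...)
    priority: int              # lower number = higher priority

def matches(rule: Rule, packet: Dict[str, int]) -> bool:
    return all(lo <= packet[field] <= hi for field, (lo, hi) in rule.fields.items())

def classify(rules, packet) -> Optional[Rule]:
    """Return the highest-priority matching rule, or None if nothing matches."""
    hits = [r for r in rules if matches(r, packet)]
    return min(hits, key=lambda r: r.priority) if hits else None

rules = [
    Rule("block-telnet", {"dst_port": (23, 23)}, priority=0),
    Rule("web-traffic",  {"dst_port": (80, 80), "proto": (6, 6)}, priority=1),
    Rule("default",      {}, priority=100),
]
packet = {"dst_port": 80, "proto": 6, "src_port": 51234}
print(classify(rules, packet).name)   # -> web-traffic
```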

741 citations

Journal ArticleDOI
Kenneth L. Clarkson
TL;DR: These results are tied together, stronger convergence results are reviewed, and several coreset bounds are generalized or strengthened.
Abstract: The problem of maximizing a concave function f(x) in the unit simplex Δ can be solved approximately by a simple greedy algorithm. For given k, the algorithm can find a point x(k) on a k-dimensional face of Δ, such that f(x(k)) ≥ f(x*) − O(1/k). Here f(x*) is the maximum value of f in Δ, and the constant factor depends on f. This algorithm and analysis were known before, and related to problems of statistics and machine learning, such as boosting, regression, and density mixture estimation. In other work, coming from computational geometry, the existence of ϵ-coresets was shown for the minimum enclosing ball problem by means of a simple greedy algorithm. Similar greedy algorithms, which are special cases of the Frank-Wolfe algorithm, were described for other enclosure problems. Here these results are tied together, stronger convergence results are reviewed, and several coreset bounds are generalized or strengthened.
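
The greedy iteration the abstract alludes to is easy to state for the minimum enclosing ball problem: repeatedly step from the current center toward the farthest input point, with a diminishing step size. The sketch below uses the commonly cited step size 1/(i + 1) and roughly 1/ε² iterations; it is a hedged illustration of the coreset-style greedy algorithm, not a transcription of the paper's analysis.

```python
import math
import random

def farthest(points, c):
    return max(points, key=lambda p: math.dist(p, c))

def meb_center(points, eps=0.05):
    """Greedy (Frank-Wolfe-style) approximation of the minimum enclosing ball center.

    Step toward the farthest point with step size 1/(i + 1); about 1/eps^2
    iterations are commonly quoted to give a (1 + eps)-approximate radius.
    """
    c = list(points[0])
    iterations = math.ceil(1.0 / eps ** 2)
    for i in range(1, iterations + 1):
        p = farthest(points, c)
        step = 1.0 / (i + 1)
        c = [ci + step * (pi - ci) for ci, pi in zip(c, p)]
    return c

random.seed(2)
pts = [(random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)) for _ in range(300)]
center = meb_center(pts, eps=0.05)
radius = max(math.dist(p, center) for p in pts)
print("approximate enclosing radius:", round(radius, 4))
```

The small set of farthest points visited during the iteration is the kind of coreset the abstract refers to.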

456 citations

References
Journal ArticleDOI
TL;DR: An algorithm is developed to test whether the intersection of two convex polyhedra is empty, and if so to find a separating plane, as well as to construct their intersection polyhedron; it runs in time O(n log n), where n is the sum of the numbers of vertices of the two polyhedra.
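
For intuition about separating planes, here is a two-dimensional analogue: two disjoint convex polygons always admit a separating direction among their edge normals. The brute-force separating-axis sketch below only illustrates that statement; it is not the O(n log n) hierarchical method of the cited paper.

```python
def edges(poly):
    return [(poly[i], poly[(i + 1) % len(poly)]) for i in range(len(poly))]

def project(poly, axis):
    dots = [p[0] * axis[0] + p[1] * axis[1] for p in poly]
    return min(dots), max(dots)

def separating_axis(a, b):
    """Return an edge normal separating convex polygons a and b, or None if they intersect."""
    for poly in (a, b):
        for p, q in edges(poly):
            normal = (-(q[1] - p[1]), q[0] - p[0])
            lo1, hi1 = project(a, normal)
            lo2, hi2 = project(b, normal)
            if hi1 < lo2 or hi2 < lo1:
                return normal
    return None

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
far_triangle = [(3, 0), (5, 0), (4, 2)]
near_triangle = [(1, 1), (3, 1), (2, 3)]
print(separating_axis(square, far_triangle))    # a separating direction, e.g. (-2, 0)
print(separating_axis(square, near_triangle))   # None: the polygons overlap
```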

311 citations

Journal ArticleDOI
TL;DR: The fundamental result is that a K-dimensional Euclidean Voronoi diagram of N points can be constructed by transforming the points to (K+1)-space; the construction extends straightforwardly to higher dimensions.
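
The transformation in question is the standard lifting map: a planar point (x, y) is lifted to (x, y, x^2 + y^2) on a paraboloid, and the Voronoi/Delaunay structure can be read off the lower convex hull of the lifted points. The sketch below checks the key fact behind this, namely that the in-circle test for a counterclockwise triangle is exactly a 3D orientation test on the lifted points (a small numerical illustration under the usual general-position assumption).

```python
def lift(p):
    """Lift a planar point onto the paraboloid z = x^2 + y^2."""
    x, y = p
    return (x, y, x * x + y * y)

def orient3d(a, b, c, d):
    """Determinant of the rows b - a, c - a, d - a: which side of plane abc holds d."""
    m = [tuple(b[i] - a[i] for i in range(3)),
         tuple(c[i] - a[i] for i in range(3)),
         tuple(d[i] - a[i] for i in range(3))]
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def in_circumcircle(a, b, c, d):
    """True if d lies inside the circle through a, b, c (a, b, c counterclockwise)."""
    # d is inside the circumcircle exactly when its lift lies below the plane
    # through the lifts of a, b and c.
    return orient3d(lift(a), lift(b), lift(c), lift(d)) < 0

a, b, c = (0.0, 0.0), (4.0, 0.0), (0.0, 4.0)   # counterclockwise triangle
print(in_circumcircle(a, b, c, (2.0, 2.0)))    # True: the circumcenter is inside
print(in_circumcircle(a, b, c, (10.0, 10.0)))  # False: far outside the circumcircle
```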

302 citations


"New applications of random sampling..." refers background in this paper

  • ...(In fact the mapping γ is not unique in this regard: see [13, 23, 2].) Proof....


Journal ArticleDOI
TL;DR: A new formulation of the notion of duality that allows the unified treatment of a number of geometric problems is used to solve two long-standing problems of computational geometry and to obtain a quadratic algorithm for computing the minimum-area triangle with vertices chosen among n points in the plane.
Abstract: This paper uses a new formulation of the notion of duality that allows the unified treatment of a number of geometric problems. In particular, we are able to apply our approach to solve two long-standing problems of computational geometry: one is to obtain a quadratic algorithm for computing the minimum-area triangle with vertices chosen among n points in the plane; the other is to produce an optimal algorithm for the half-plane range query problem. This problem is to preprocess n points in the plane, so that given a test half-plane, one can efficiently determine all points lying in the half-plane. We describe an optimal O(k + log n) time algorithm for answering such queries, where k is the number of points to be reported. The algorithm requires O(n) space and O(n log n) preprocessing time. Both of these results represent significant improvements over the best methods previously known. In addition, we give a number of new combinatorial results related to the computation of line arrangements.
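
For scale, the minimum-area triangle problem that this duality framework solves in quadratic time has an obvious O(n^3) brute-force baseline; the sketch below implements only that baseline for comparison, not the cited algorithm.

```python
from itertools import combinations

def triangle_area(p, q, r):
    return abs((q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])) / 2.0

def min_area_triangle_bruteforce(points):
    """O(n^3) baseline; the duality-based algorithm achieves O(n^2)."""
    return min(((triangle_area(p, q, r), (p, q, r))
                for p, q, r in combinations(points, 3)),
               key=lambda item: item[0])

pts = [(0, 0), (4, 0), (0, 3), (2, 1), (5, 5)]
area, triangle = min_area_triangle_bruteforce(pts)
print(area, triangle)   # -> 1.0 ((4, 0), (0, 3), (2, 1))
```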

286 citations


"New applications of random sampling..." refers result in this paper

  • ...These results do not improve the algorithm of [7] for halfplane queries; that algorithm requires O(n) storage, O(n log n) preprocessing, and O(A + log n) query time....


Proceedings ArticleDOI
07 Nov 1983
TL;DR: An optimal algorithm is presented for constructing an arrangement of hyperplanes in arbitrary dimensions and is shown to improve known worst-case time complexities for five problems: computing all order-k Voronoi diagrams, computing the λ-matrix, estimating halfspace queries, degeneracy testing, and finding the minimum volume simplex determined by a set of points.
Abstract: An optimal algorithm is presented for constructing an arrangement of hyperplanes in arbitrary dimensions. It relies on a combinatorial result that is of interest in its own right. The algorithm is shown to improve known worst-case time complexities for five problems: computing all order-k Voronoi diagrams, computing the λ-matrix, estimating halfspace queries, degeneracy testing, and finding the minimum volume simplex determined by a set of points.
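
A quick check of the combinatorics underlying such arrangement algorithms: a simple arrangement of n lines in the plane has C(n, 2) vertices, n^2 edges, and C(n, 2) + n + 1 faces. The sketch below verifies the vertex count on a random (hence, with probability 1, simple) arrangement; it illustrates the counting, not the construction algorithm of the paper.

```python
import random
from itertools import combinations
from math import comb

def intersection(l1, l2):
    """Intersection of lines given as (m, c) in y = m*x + c; None if parallel."""
    m1, c1 = l1
    m2, c2 = l2
    if m1 == m2:
        return None
    x = (c2 - c1) / (m1 - m2)
    return (x, m1 * x + c1)

random.seed(3)
n = 25
lines = [(random.uniform(-3, 3), random.uniform(-3, 3)) for _ in range(n)]

vertices = {intersection(a, b) for a, b in combinations(lines, 2)}
vertices.discard(None)

# Random slopes and intercepts give a simple arrangement with probability 1,
# so the number of distinct vertices matches C(n, 2) exactly.
print(len(vertices), comb(n, 2))                 # both 300 for n = 25
print("faces predicted:", comb(n, 2) + n + 1)    # 326
```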

285 citations


"New applications of random sampling..." refers background or methods in this paper

  • ...The lemma follows immediately for simple arrangements, using the O(r^d) bound on the number of vertices of an arrangement in E^d [12]....


  • ...Edelsbrunner and others [12] give an algorithm for determining from S the facial structure of AS, that is, the faces of AS and their containment relations....


  • ...A simple arrangement is one for which every intersection of k hyperplanes is a (d − k)-flat, for 1 ≤ k ≤ d + 1. (Following [12], the empty set is a (−1)-flat, by convention.) A k-face f...


Journal ArticleDOI
TL;DR: Classic binary search is extended to multidimensional search problems and yields efficient algorithms for a number of tasks such as a secondary searching problem of Knuth, region location in planar graphs, and speech recognition.
Abstract: Classic binary search is extended to multidimensional search problems. This extension yields efficient algorithms for a number of tasks such as a secondary searching problem of Knuth, region location in planar graphs, and speech recognition.

262 citations


"New applications of random sampling..." refers result in this paper

  • ...On the other hand, its preprocessing time and storage compare quite well with those of previous algorithms for range queries having query times that are O(log s) [10, 9]....
