Journal ArticleDOI

New applications of random sampling in computational geometry

Kenneth L. Clarkson1
01 Jun 1987-Discrete and Computational Geometry (Springer New York)-Vol. 2, Iss: 1, pp 195-222
TL;DR: This paper gives several new demonstrations of the usefulness of random sampling techniques in computational geometry, including a search structure for arrangements of hyperplanes built by sampling the hyperplanes and using information from the resulting arrangement to divide and conquer.
Abstract: This paper gives several new demonstrations of the usefulness of random sampling techniques in computational geometry. One new algorithm creates a search structure for arrangements of hyperplanes by sampling the hyperplanes and using information from the resulting arrangement to divide and conquer. This algorithm requires O(s^(d+ε)) expected preprocessing time to build a search structure for an arrangement of s hyperplanes in d dimensions. The expectation, as with all expected times reported here, is with respect to the random behavior of the algorithm, and holds for any input. Given the data structure, and a query point p, the cell of the arrangement containing p can be found in O(log s) worst-case time. (The bound holds for any fixed ε > 0, with the constant factors dependent on d and ε.) Using point-plane duality, the algorithm may be used for answering halfspace range queries. Another algorithm finds random samples of simplices to determine the separation distance of two polytopes. The algorithm uses expected O(n^⌊d/2⌋) time, where n is the total number of vertices of the two polytopes. This matches previous results [10] for the case d = 3 and extends them. Another algorithm samples points in the plane to determine their order-k Voronoi diagram, and requires expected O(s^(1+ε) k) time for s points. (It is assumed that no four of the points are cocircular.) This sharpens the bound O(sk^2 log s) for Lee's algorithm [21], and O(s^2 log s + k(s−k) log^2 s) for Chazelle and Edelsbrunner's algorithm [4]. Finally, random sampling is used to show that any set of s points in E^3 has O(sk^2 log^8 s/(log log s)^6) distinct j-sets with j ≤ k. (For S ⊆ E^d, a set S′ ⊆ S with |S′| = j is a j-set of S if there is a half-space h^+ with S′ = S ∩ h^+.) This sharpens with respect to k the previous bound O(sk^5) [5]. The proof of the bound given here is an instance of a "probabilistic method" [15].
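
To make the sampling-based divide-and-conquer idea concrete, the sketch below builds a toy search structure in one dimension, where "hyperplanes" degenerate to numbers on a line and arrangement cells to the intervals between consecutive numbers. This is only a hedged analogue of the technique, not the paper's d-dimensional data structure; the class and parameter names (SampleSearch1D, sample_size, LEAF_SIZE) are invented for illustration.

```python
import bisect
import random

class SampleSearch1D:
    """Toy 1-D analogue of sampling-based divide and conquer: 'hyperplanes' are
    numbers on a line, and the cells of their arrangement are the intervals
    between consecutive numbers."""

    LEAF_SIZE = 8

    def __init__(self, hyperplanes, sample_size=16):
        self.pts = sorted(set(hyperplanes))
        self.children = None
        if len(self.pts) <= self.LEAF_SIZE:
            self.sample = self.pts                    # small instance: keep the sorted list
            return
        # Draw a random sample; its own arrangement (sorted sample values cutting
        # the line into intervals) is used to bucket the remaining hyperplanes.
        self.sample = sorted(random.sample(self.pts, min(sample_size, len(self.pts))))
        buckets = [[] for _ in range(len(self.sample) + 1)]
        for h in self.pts:
            buckets[bisect.bisect_right(self.sample, h)].append(h)
        self.children = [SampleSearch1D(b, sample_size) if b else None for b in buckets]

    def locate(self, q):
        """Cell of the full arrangement containing q: (largest value <= q, smallest value > q)."""
        if self.children is None:                     # leaf: ordinary binary search
            j = bisect.bisect_right(self.pts, q)
            return (self.pts[j - 1] if j > 0 else None,
                    self.pts[j] if j < len(self.pts) else None)
        j = bisect.bisect_right(self.sample, q)       # locate q among the sample's cells
        lo = self.sample[j - 1] if j > 0 else None
        hi = self.sample[j] if j < len(self.sample) else None
        if self.children[j] is not None:              # refine inside the matching bucket
            clo, chi = self.children[j].locate(q)
            if clo is not None and (lo is None or clo > lo):
                lo = clo
            if chi is not None and (hi is None or chi < hi):
                hi = chi
        return lo, hi

random.seed(0)
pts = [random.uniform(0, 100) for _ in range(1000)]
tree = SampleSearch1D(pts)
print(tree.locate(42.0))    # the two consecutive input values bracketing 42.0
```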


Citations
Journal ArticleDOI
TL;DR: The Voronoi diagram, as discussed by the authors, partitions the plane according to the nearest-neighbor rule: each point of a given set is associated with the region of the plane closest to it.
Abstract: Computational geometry is concerned with the design and analysis of algorithms for geometrical problems. In addition, other, more practically oriented areas of computer science—such as computer graphics, computer-aided design, robotics, pattern recognition, and operations research—give rise to problems that inherently are geometrical. This is one reason computational geometry has attracted enormous research interest in the past decade and is a well-established area today. (For standard sources, we refer to the survey article by Lee and Preparata [1984] and to the textbooks by Preparata and Shamos [1985] and Edelsbrunner [1987b].) Readers familiar with the literature of computational geometry will have noticed, especially in the last few years, an increasing interest in a geometrical construct called the Voronoi diagram. This trend can also be observed in combinatorial geometry and in a considerable number of articles in natural science journals that address the Voronoi diagram under different names specific to the respective area. Given some number of points in the plane, their Voronoi diagram divides the plane according to the nearest-neighbor rule: each point is associated with the region of the plane closest to it.
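
The nearest-neighbor rule above can be stated in a few lines of code; the brute-force sketch below (names and coordinates chosen purely for illustration) assigns a query location to its closest site, which is exactly the assignment the Voronoi diagram encodes geometrically.

```python
from math import dist  # Python 3.8+

def voronoi_cell_of(q, sites):
    """Nearest-neighbor rule: return the site whose Voronoi region contains q."""
    return min(sites, key=lambda s: dist(q, s))

sites = [(0.0, 0.0), (4.0, 1.0), (2.0, 5.0)]
print(voronoi_cell_of((3.0, 2.0), sites))   # -> (4.0, 1.0)
```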

4,236 citations

Proceedings ArticleDOI
Kenneth L. Clarkson1
06 Jan 1988
TL;DR: Asymptotically tight bounds for a combinatorial quantity of interest in discrete and computational geometry, related to halfspace partitions of point sets, are given.
Abstract: Random sampling is used for several new geometric algorithms. The algorithms are “Las Vegas,” and their expected bounds are with respect to the random behavior of the algorithms. One algorithm reports all the intersecting pairs of a set of line segments in the plane, and requires O(A + n log n) expected time, where A is the size of the answer, the number of intersecting pairs reported. The algorithm requires O(n) space in the worst case. Another algorithm computes the convex hull of a point set in E^3 in O(n log A) expected time, where n is the number of points and A is the number of points on the surface of the hull. A simple Las Vegas algorithm triangulates simple polygons in O(n log log n) expected time. Algorithms for half-space range reporting are also given. In addition, this paper gives asymptotically tight bounds for a combinatorial quantity of interest in discrete and computational geometry, related to halfspace partitions of point sets.

1,163 citations

Proceedings ArticleDOI
01 Jan 1993
TL;DR: The vp-tree (vantage point tree) is introduced in several forms, together with associated algorithms, as an improved method for these difficult search problems in general metric spaces.
Abstract: We consider the computational problem of finding nearest neighbors in general metric spaces. Of particular interest are spaces that may not be conveniently embedded or approximated in Euclidean space, or where the dimensionality of a Euclidean representation is very high. Also relevant are high-dimensional Euclidean settings in which the distribution of data is in some sense of lower dimension and embedded in the space. The vp-tree (vantage point tree) is introduced in several forms, together with associated algorithms, as an improved method for these difficult search problems. Tree construction executes in O(n log n) time, and search is, under certain circumstances and in the limit, O(log n) expected time. The theoretical basis for this approach is developed and the results of several experiments are reported. In Euclidean cases, kd-tree performance is compared.
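
The following is a minimal, hedged rendering of a vantage-point tree in Python: pick a vantage point, split the remaining points by the median distance to it, recurse, and prune at query time with the triangle inequality. It follows the common textbook formulation rather than reproducing the paper's exact construction or analysis; all identifiers are illustrative.

```python
import random
from math import dist
from statistics import median

class VPTree:
    """Vantage-point tree over a metric space given by metric(a, b)."""

    def __init__(self, points, metric=dist):
        self.metric = metric
        self.point = None          # vantage point stored at this node
        self.radius = None         # median distance from the vantage point
        self.inside = self.outside = None
        points = list(points)
        if not points:
            return
        self.point = points.pop(random.randrange(len(points)))
        if points:
            dists = [metric(self.point, p) for p in points]
            self.radius = median(dists)                         # split by median distance
            inside = [p for p, d in zip(points, dists) if d <= self.radius]
            outside = [p for p, d in zip(points, dists) if d > self.radius]
            self.inside = VPTree(inside, metric) if inside else None
            self.outside = VPTree(outside, metric) if outside else None

    def nearest(self, q, best=None, best_d=float("inf")):
        """Return (nearest stored point, its distance), pruning with the triangle inequality."""
        if self.point is None:
            return best, best_d
        d = self.metric(q, self.point)
        if d < best_d:
            best, best_d = self.point, d
        if self.radius is None:
            return best, best_d
        near, far = ((self.inside, self.outside) if d <= self.radius
                     else (self.outside, self.inside))
        if near is not None:
            best, best_d = near.nearest(q, best, best_d)
        # The far side can only help if the best-so-far ball around q crosses the median split.
        if far is not None and abs(d - self.radius) <= best_d:
            best, best_d = far.nearest(q, best, best_d)
        return best, best_d

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(2000)]
tree = VPTree(pts)
print(tree.nearest((0.25, 0.75)))   # (closest stored point, its Euclidean distance)
```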

1,145 citations


Cites methods from "New applications of random sampling..."

  • ...More recently, the Voronoi diagram [21] has provided a useful tool in low-dimensional Euclidean settings, and the overall field and outlook of Computational Geometry has yielded many interesting results such as those of [22, 23, 24, 25] and earlier [26]....


Proceedings ArticleDOI
01 Oct 1998
TL;DR: New packet classification schemes are presented that, with a worst-case and traffic-independent performance metric, can classify packets, by checking amongst a few thousand filtering rules, at rates of a million packets per second using range matches on more than 4 packet header fields.
Abstract: The ability to provide differentiated services to users with widely varying requirements is becoming increasingly important, and Internet Service Providers would like to provide these differentiated services using the same shared network infrastructure. The key mechanism, that enables differentiation in a connectionless network, is the packet classification function that parses the headers of the packets, and after determining their context, classifies them based on administrative policies or real-time reservation decisions. Packet classification, however, is a complex operation that can become the bottleneck in routers that try to support gigabit link capacities. Hence, many proposals for differentiated services only require classification at lower speed edge routers and also avoid classification based on multiple fields in the packet header even if it might be advantageous to service providers. In this paper, we present new packet classification schemes that, with a worst-case and traffic-independent performance metric, can classify packets, by checking amongst a few thousand filtering rules, at rates of a million packets per second using range matches on more than 4 packet header fields. For a special case of classification in two dimensions, we present an algorithm that can handle more than 128K rules at these speeds in a traffic independent manner. We emphasize worst-case performance over average case performance because providing differentiated services requires intelligent queueing and scheduling of packets that precludes any significant queueing before the differentiating step (i.e., before packet classification). The presented filtering or classification schemes can be used to classify packets for security policy enforcement, applying resource management decisions, flow identification for RSVP reservations, multicast look-ups, and for source-destination and policy based routing. The scalability and performance of the algorithms have been demonstrated by implementation and testing in a prototype system.
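
For readers unfamiliar with multi-field range matching, the sketch below states the classification problem itself as a naive linear scan over rules; it is only a baseline to make the operation concrete, not one of the paper's schemes, and the field layout and rule names are assumptions made for illustration.

```python
from typing import List, Optional, Tuple

# A rule: one inclusive (lo, hi) range per header field, plus an action label.
Rule = Tuple[List[Tuple[int, int]], str]

def classify(packet_fields: List[int], rules: List[Rule]) -> Optional[str]:
    """Return the action of the first (highest-priority) rule whose ranges all match."""
    for ranges, action in rules:
        if all(lo <= field <= hi for field, (lo, hi) in zip(packet_fields, ranges)):
            return action
    return None

rules: List[Rule] = [
    ([(10, 20), (0, 65535), (80, 80)], "queue-premium"),   # e.g. src-id range, src port, dst port
    ([(0, 255), (0, 65535), (0, 65535)], "best-effort"),
]
print(classify([15, 1234, 80], rules))    # -> queue-premium
print(classify([99, 1234, 443], rules))   # -> best-effort
```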

741 citations

Journal ArticleDOI
Kenneth L. Clarkson1
TL;DR: These results are tied together, stronger convergence results are reviewed, and several coreset bounds are generalized or strengthened.
Abstract: The problem of maximizing a concave function f(x) in the unit simplex Δ can be solved approximately by a simple greedy algorithm. For given k, the algorithm can find a point x^(k) on a k-dimensional face of Δ, such that f(x^(k)) ≥ f(x^*) − O(1/k). Here f(x^*) is the maximum value of f in Δ, and the constant factor depends on f. This algorithm and analysis were known before, and related to problems of statistics and machine learning, such as boosting, regression, and density mixture estimation. In other work, coming from computational geometry, the existence of ε-coresets was shown for the minimum enclosing ball problem by means of a simple greedy algorithm. Similar greedy algorithms, which are special cases of the Frank-Wolfe algorithm, were described for other enclosure problems. Here these results are tied together, stronger convergence results are reviewed, and several coreset bounds are generalized or strengthened.
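
A minimal Frank-Wolfe-style sketch of the greedy method described above: each step moves toward the simplex vertex that best improves the linearized objective, so after k steps the iterate has at most k+1 nonzero coordinates, mirroring the coreset view. The quadratic example objective and the classic 2/(k+2) step schedule are illustrative choices, not taken from the paper.

```python
import numpy as np

def frank_wolfe_simplex(grad, n, iters=200):
    """Greedy (Frank-Wolfe style) maximization of a concave f over the unit simplex.
    Starting at a vertex, after k steps the iterate has at most k+1 nonzero
    coordinates, i.e. it lies on a low-dimensional face of the simplex."""
    x = np.zeros(n)
    x[0] = 1.0                          # start at a vertex of the simplex
    for k in range(iters):
        g = grad(x)
        j = int(np.argmax(g))           # best vertex e_j for the linearized objective
        step = 2.0 / (k + 2.0)          # classic schedule giving O(1/k) suboptimality
        x *= (1.0 - step)
        x[j] += step
    return x

# Toy objective: f(x) = -||A x - b||^2 is concave; its gradient is -2 A^T (A x - b).
rng = np.random.default_rng(0)
A = rng.normal(size=(30, 10))
b = rng.normal(size=30)
x = frank_wolfe_simplex(lambda x: -2.0 * A.T @ (A @ x - b), n=10)
print(x.sum(), (x >= 0).all())          # stays in the simplex: sums to 1, nonnegative
```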

456 citations

References
Journal ArticleDOI
TL;DR: This work presents a new planar convex hull algorithm with worst case time complexity O(n log H) where n is the size of the input set and H is the size of the output set, i.e. the number of vertices found to be on the hull.
Abstract: We present a new planar convex hull algorithm with worst case time complexity O(n log H) where n is the size of the input set and H is the size of the output set, i.e. the number of vertices found to be on the hull. We also show that this algorithm is asymptotically worst case optimal on a rather realistic model of computation even if the complexity of the problem is measured in terms of input as well as output size. The algorithm relies on a variation of the divide-and-conquer paradigm which we call the "marriage-before-conquest" principle and which appears to be interesting in its own right.
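
For contrast, the sketch below is not the marriage-before-conquest algorithm but the much simpler Jarvis march, which is also output-sensitive, at O(nH) rather than O(n log H); it is included only to make the notion of output-sensitive hull computation concrete.

```python
def cross(o, a, b):
    """Signed area of triangle o-a-b; > 0 means b lies left of the ray o->a."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def gift_wrap(points):
    """Jarvis march: output-sensitive planar convex hull in O(nH) time,
    where H is the number of hull vertices (returned in counter-clockwise order)."""
    pts = sorted(set(points))
    if len(pts) < 3:
        return pts
    hull = []
    start = pts[0]                       # lexicographically smallest point is on the hull
    p = start
    while True:
        hull.append(p)
        q = pts[0] if pts[0] != p else pts[1]
        for r in pts:
            c = cross(p, q, r)
            # take r if it lies strictly right of p->q, or is collinear but farther out
            if c < 0 or (c == 0 and dist2(p, r) > dist2(p, q)):
                q = r
        p = q
        if p == start:
            break
    return hull

print(gift_wrap([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1), (1, 0)]))
# -> [(0, 0), (2, 0), (2, 2), (0, 2)]
```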

416 citations


"New applications of random sampling..." refers background or methods in this paper

  • ...Kirkpatrick and Seidel obtained a deterministic algorithm for planar convex hulls with the same time bound [31]....


  • ...This is done as follows: let t be a triangular face of a region F ∈ F(R), with p a vertex of t. Then the polygon P_t = t \ P(S) can be determined using the algorithm of Kirkpatrick and Seidel [31] in time on the order of |F| log A_t, where A_t is the number of sides of P_t. All but two of the sides of P_t correspond to faces of P(S), so that the total time to compute all such polygons is expected O(n log A′), where A′ is...


BookDOI
01 Mar 1999
TL;DR: A collection of mathematical techniques for the analysis of algorithms, covering binomial identities, recurrence relations, operator methods, and asymptotic analysis, with worked applications such as hashing.
Abstract: Preface.- Binomial Identities.- Summary of Useful Identities.- Deriving the Identities.- Inverse Relations.- Operator Calculus.- Hypergeometric Series.- Identities with the Harmonic Numbers.- Recurrence Relations.- Linear Recurrence Relations.- Nonlinear Recurrence Relations.- Operator Methods.- The Cookie Monster.- Coalesced Hashing.- Open Addressing: Uniform Hashing.- Open Addressing: Secondary Clustering.- Asymptotic Analysis.- Basic Concepts.- Stieltjes Integration and Asymptotics.- Asymptotics from Generating Functions.- Bibliography.- Appendices.- Schedule of Lectures.- Homework Assignments.- Midterm Exam I and Solutions.- Final Exam I and Solutions.- Midterm Exam II and Solutions.- Final Exam II and Solutions.- Midterm Exam III and Solutions.- Final Exam III and Solutions.- A Qualifying Exam Problem and Solution.- Index

381 citations

Proceedings ArticleDOI
01 Aug 1986
TL;DR: A new technique for half-space and simplex range queries uses random sampling to build a partition-tree structure, and the concept of an ε-net for an abstract set of ranges is introduced to describe the desired result of this random sampling.
Abstract: We present a new technique for half-space and simplex range query using O(n) space and O(n^α) query time, where α < 1 depends on the dimension d. These bounds are better than those previously published for all d ≥ 2. The technique uses random sampling to build a partition-tree structure. We introduce the concept of an ε-net for an abstract set of ranges to describe the desired result of this random sampling and give necessary and sufficient conditions that a random sample is an ε-net with high probability. We illustrate the application of these ideas to other range query problems.
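
The ε-net idea can be illustrated empirically: draw a random sample and search, by Monte Carlo, for a halfplane that contains at least an ε fraction of the points yet misses the sample. The sample size below is a rough heuristic on the (1/ε) log(1/ε) scale, not the paper's bound, and all names are illustrative.

```python
import random

def misses_heavy_range(sample, pts, eps, trials=2000):
    """Search (by Monte Carlo) for a halfplane a*x + b*y <= c that contains at least
    an eps fraction of pts but no sample point; return it if found, else None."""
    n = len(pts)
    for _ in range(trials):
        a, b = random.gauss(0, 1), random.gauss(0, 1)
        c = random.uniform(-2, 2)
        heavy = sum(1 for x, y in pts if a * x + b * y <= c) >= eps * n
        hit = any(a * x + b * y <= c for x, y in sample)
        if heavy and not hit:
            return (a, b, c)
    return None

random.seed(1)
pts = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(2000)]
eps = 0.1
sample = random.sample(pts, 100)              # size ~ (1/eps) log(1/eps) scale; heuristic
print(misses_heavy_range(sample, pts, eps))   # usually None: the sample behaves as an eps-net
```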

378 citations

Journal ArticleDOI
TL;DR: It is shown that the k-nearest neighbor problem and other seemingly unrelated problems can be solved efficiently with the Voronoi diagram.
Abstract: The notion of Voronoi diagram for a set of N points in the Euclidean plane is generalized to the Voronoi diagram of order k, and an iterative algorithm to construct the generalized diagram in O(k^2 N log N) time using O(k^2 (N − k)) space is presented. It is shown that the k-nearest neighbor problem and other seemingly unrelated problems can be solved efficiently with the diagram.
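
The query that an order-k Voronoi region answers can be spelled out directly: the set of the k sites nearest to a point, which is constant over each region of the diagram. The brute-force sketch below is purely illustrative; the point of the diagram is to avoid this linear scan.

```python
from math import dist

def k_nearest_sites(q, sites, k):
    """The k sites closest to q: the label that an order-k Voronoi region stores."""
    return sorted(sites, key=lambda s: dist(q, s))[:k]

sites = [(0, 0), (3, 0), (0, 4), (5, 5), (1, 1)]
print(k_nearest_sites((1, 0), sites, k=2))   # -> [(0, 0), (1, 1)]
```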

361 citations


"New applications of random sampling..." refers methods in this paper

  • ...If s > k(r − 7), then the time required includes that for computing a constant number of order O(j∗) Voronoi diagrams of r sites, requiring O(r j∗^2 log r) time, using the [21] algorithm....


  • ...) This sharpens the bound O(sk^2 log s) for Lee's algorithm [21], and O(s^2 log s + k(s − k) log^2 s) for Chazelle and Edelsbrunner's algorithm [4]....


  • ...(By [21], there are O(sk) such triples....


  • ...Lee [21] showed that the order k Voronoi diagram on s sites has O(k(s − k)) regions, and he gave an algorithm requiring O(sk^2 log s) time for the construction of such diagrams....


Journal ArticleDOI
TL;DR: It turns out that the standard Euclidean Voronoi diagram of point sets in R^d along with its order-k generalizations are intimately related to certain arrangements of hyperplanes, and this fact can be used to obtain new Voronoi diagram algorithms.
Abstract: We propose a uniform and general framework for defining and dealing with Voronoi diagrams. In this framework a Voronoi diagram is a partition of a domain D induced by a finite number of real valued functions on D. Valuable insight can be gained when one considers how these real valued functions partition D × R. With this view it turns out that the standard Euclidean Voronoi diagram of point sets in R^d along with its order-k generalizations are intimately related to certain arrangements of hyperplanes. This fact can be used to obtain new Voronoi diagram algorithms. We also discuss how the formalism of arrangements can be used to solve certain intersection and union problems.
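
A tiny sketch of the hyperplane connection described above: lifting each site p to the affine function h_p(x) = 2 p·x − |p|^2, the nearest site to x is the one whose hyperplane is highest at x, so the Euclidean Voronoi diagram is the projection of the upper envelope of these hyperplanes. The code below merely checks this identity numerically; the function and variable names are illustrative.

```python
import numpy as np

def nearest_site_via_lifting(x, sites):
    """Nearest site = the site whose lifted hyperplane h_p(x) = 2 p.x - |p|^2 is highest at x."""
    sites = np.asarray(sites, dtype=float)
    x = np.asarray(x, dtype=float)
    heights = 2.0 * sites @ x - np.sum(sites ** 2, axis=1)
    return sites[int(np.argmax(heights))]

sites = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]
print(nearest_site_via_lifting((3.0, 1.0), sites))                  # -> [4. 0.]
print(min(sites, key=lambda p: (3.0 - p[0])**2 + (1.0 - p[1])**2))  # -> (4.0, 0.0), same site
```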

346 citations