
Showing papers by "Jeff Erickson published in 1995"


Proceedings Article
01 Jan 1995
TL;DR: This paper considers the relative complexities of a large number of computational geometry problems whose complexities are believed to be roughly Θ(n^{4/3}), and surveys known reductions among problems involving lines in three-space, and among higher-dimensional closest-pair problems.

Abstract: We consider the relative complexities of a large number of computational geometry problems whose complexities are believed to be roughly Θ(n^{4/3}). For certain pairs of problems, we show that the complexity of one problem is asymptotically bounded by the complexity of the other. Almost all of the problems we consider can be solved in time O(n^{4/3}) or better, and there are Ω(n^{4/3}) lower bounds for a few of them in specialized models of computation. However, the best known lower bound in any general model of computation is only Ω(n log n). The paper is naturally divided into two parts. In the first part, we consider a large number of problems that are harder than Hopcroft's problem. These problems include various ray shooting problems, sorting line segments in ℝ^3, collision detection in ℝ^3, and halfspace emptiness checking in ℝ^5. In the second, we survey known reductions among problems involving lines in three-space, and among higher-dimensional closest-pair problems. Some of our results rely on the introduction of formal infinitesimals during reduction; we show that such a reduction is meaningful in the algebraic decision tree model.

53 citations


Journal ArticleDOI
TL;DR: It is shown that in the worst case, Ω(n^d) sidedness queries are required to determine whether a set of n points in ℝ^d is affinely degenerate, i.e., whether it contains d+1 points on a common hyperplane.

Abstract: We show that in the worst case, Ω(n^d) sidedness queries are required to determine whether a set of n points in ℝ^d is affinely degenerate, i.e., whether it contains d+1 points on a common hyperplane. This matches known upper bounds. We give a straightforward adversary argument, based on the explicit construction of a point set containing Ω(n^d) "collapsible" simplices, any one of which can be made degenerate without changing the orientation of any other simplex. As an immediate corollary, we have an Ω(n^d) lower bound on the number of sidedness queries required to determine the order type of a set of n points in ℝ^d. Using similar techniques, we also show that Ω(n^{d+1}) in-sphere queries are required to decide the existence of spherical degeneracies in a set of n points in ℝ^d.
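The sidedness query counted by this bound has a concrete form: it is the sign of an orientation determinant on d+1 points, and a point set is affinely degenerate exactly when some (d+1)-tuple has orientation zero. A minimal Python sketch of the query and the naive O(n^{d+1}) degeneracy test (function names are my own; this brute force only illustrates the primitive, it is not the paper's adversary construction):

```python
from itertools import combinations

def sign(x):
    return (x > 0) - (x < 0)

def det(m):
    # Exact integer determinant by Laplace expansion (fine for small d).
    if len(m) == 1:
        return m[0][0]
    total = 0
    for j, val in enumerate(m[0]):
        if val:
            minor = [row[:j] + row[j + 1:] for row in m[1:]]
            total += (-1) ** j * val * det(minor)
    return total

def orientation(pts):
    # Sidedness query: sign of the (d+1)x(d+1) determinant whose rows
    # are the d+1 points with a 1 appended; 0 means they are affinely
    # dependent (lie on a common hyperplane).
    return sign(det([list(p) + [1] for p in pts]))

def is_affinely_degenerate(points):
    # Naive test: check every (d+1)-tuple -- O(n^{d+1}) queries,
    # versus the paper's Omega(n^d) lower bound.
    d = len(points[0])
    return any(orientation(t) == 0 for t in combinations(points, d + 1))
```

With integer coordinates the determinant is exact, so the zero test is reliable; for example, adding (2, 0) to the non-degenerate set {(0,0), (1,0), (0,1)} creates a collinear triple and flips the answer.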

41 citations


Proceedings ArticleDOI
22 Jan 1995
TL;DR: In this article, an Ω(n^⌈r/2⌉) lower bound is proved for the following problem: for some fixed linear equation in r variables, given n real numbers, do any r of them satisfy the equation?

Abstract: We prove an Ω(n^⌈r/2⌉) lower bound for the following problem: For some fixed linear equation in r variables, given n real numbers, do any r of them satisfy the equation? Our lower bound holds in a restricted linear decision tree model, in which each decision is based on the sign of an arbitrary linear combination of r or fewer inputs. In this model, our lower bound is as large as possible. Previously, this lower bound was known only for a few special cases and only in more specialized models of computation. Our lower bound follows from an adversary argument. We show that for any algorithm, there is an input that contains Ω(n^⌈r/2⌉) "critical" r-tuples, which have the following important property. None of the critical tuples satisfies the equation; however, if the algorithm does not directly test each critical tuple, then the adversary can modify the input, in a way that is undetectable to the algorithm, so that some untested tuple does satisfy the equation. A key step in the proof is the introduction of formal infinitesimals into the adversary input. A theorem of Tarski implies that if we can construct a single input containing infinitesimals that is hard for every algorithm, then for every decision tree algorithm there exists a corresponding real-valued input which is hard for that algorithm. An extended abstract of this paper can be found in [Eri95].
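For r = 3 and the equation x + y + z = 0 this is exactly the well-known 3SUM problem. A brute-force O(n^r) checker makes the problem statement concrete (a hedged sketch; the function name, the distinct-index convention, and the coefficient encoding are my own assumptions, and this naive search is far above the Ω(n^⌈r/2⌉) bound):

```python
from itertools import combinations, permutations

def satisfies_some_tuple(nums, coeffs, c=0):
    # Fixed equation: coeffs[0]*x1 + ... + coeffs[r-1]*xr == c.
    # Tries every ordered r-tuple of distinct inputs: O(n^r) time.
    r = len(coeffs)
    for combo in combinations(range(len(nums)), r):
        for perm in permutations(combo):
            if sum(a * nums[i] for a, i in zip(coeffs, perm)) == c:
                return True
    return False
```

For instance, with coeffs (1, 1, 1) and c = 0 this is a 3SUM checker; the ordered-tuple loop matters only when the coefficients are not all equal.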

18 citations


Proceedings ArticleDOI
01 Sep 1995
TL;DR: A combinatorial representation of the relative order type of a set of points and hyperplanes, called a monochromatic cover, is defined, and worst-case lower bounds on its size are derived; the running time of any partitioning algorithm is shown to be bounded below by the size of some monochromatic cover.

Abstract: We establish new lower bounds on the complexity of the following basic geometric problem, attributed to John Hopcroft: Given a set of n points and m hyperplanes in ℝ^d, is any point contained in any hyperplane? We define a general class of partitioning algorithms, and show that in the worst case, for all m and n, any such algorithm requires time Ω(n log m + n^{2/3}m^{2/3} + m log n) in two dimensions, or Ω(n log m + n^{5/6}m^{1/2} + n^{1/2}m^{5/6} + m log n) in three or more dimensions. We obtain slightly higher bounds for the counting version of Hopcroft's problem in four or more dimensions. Our planar lower bound is within a factor of 2^{O(log*(n+m))} of the best known upper bound, due to Matoušek. Previously, the best known lower bound, in any dimension, was Ω(n log m + m log n). We develop our lower bounds in two stages. First we define a combinatorial representation of the relative order type of a set of points and hyperplanes, called a monochromatic cover, and derive lower bounds on its size in the worst case. We then show that the running time of any partitioning algorithm is bounded below by the size of some monochromatic cover. As a related result, using a straightforward adversary argument, we derive a quadratic lower bound on the complexity of Hopcroft's problem in a surprisingly powerful decision tree model of computation.
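Hopcroft's problem itself is easy to state in code: the naive incidence test below runs in O(nmd) time, which is the baseline that partitioning algorithms (and these lower bounds) measure against. A minimal sketch with hypothetical names; hyperplanes are encoded as my own assumed pairs (a, b) meaning a · x = b:

```python
def hopcroft_naive(points, hyperplanes):
    # Given n points and m hyperplanes (a, b) with a . x == b,
    # report whether any point lies on any hyperplane.
    # Naive O(n*m*d) check; sub-quadratic algorithms (e.g. Matousek's,
    # near n^{4/3} in the plane when m = n) beat this via partitioning.
    return any(
        sum(ai * xi for ai, xi in zip(a, p)) == b
        for p in points
        for (a, b) in hyperplanes
    )
```

In the plane a "hyperplane" is just a line, so ((1, -1), 0) encodes the line x = y, and the test reduces to point-on-line incidence.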

8 citations