Author

John Iacono

Other affiliations: New York University, Aarhus University, Rutgers University
Bio: John Iacono is an academic researcher from Université libre de Bruxelles. The author has contributed to research in topics: Data structures & Amortized analysis. The author has an h-index of 24 and has co-authored 170 publications receiving 2130 citations. Previous affiliations of John Iacono include New York University & Aarhus University.


Papers
Journal ArticleDOI
01 May 2004
TL;DR: This work presents a data structure that is optimized for answering queries quickly when they are geometrically close to the previous successful query, and works with a variety of distance functions.
Abstract: In the 2D point searching problem, the goal is to preprocess n points P = {p1, ..., pn} in the plane so that, for an online sequence of query points q1, ..., qm, it can quickly be determined which (if any) of the elements of P are equal to each query point qi. This problem can be solved in O(log n) time by mapping the problem to one dimension. We present a data structure that is optimized for answering queries quickly when they are geometrically close to the previous successful query. Specifically, our data structure executes queries in time O(log d(qi-1, qi)), where d is some distance function between two points, and uses O(n log n) space. Our structure works with a variety of distance functions. In contrast, it is proved that, for some of the most intuitive distance functions d, it is impossible to obtain an O(log d(qi-1, qi)) runtime, or any bound that is o(log n).
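The distance-sensitive bound is easiest to see in one dimension, where galloping (finger) search over a sorted array answers a membership query in O(log d) time, with d the rank distance between the current query and the previous successful one. The Python sketch below illustrates only that one-dimensional idea; the class name and the array-based setup are illustrative assumptions, not the paper's two-dimensional structure.

```python
import bisect


class FingerSearchArray:
    """Membership queries on a static sorted array in O(log d) time, where d is
    the rank distance between the current query and the previous successful one
    (a one-dimensional analogue of the distance-sensitive bound above)."""

    def __init__(self, values):
        self.a = sorted(values)
        self.finger = 0  # index of the last successful query

    def contains(self, q):
        a, i = self.a, self.finger
        if not a:
            return False
        bound = 1
        if q >= a[i]:
            # Gallop right until q is bracketed or the end of the array is reached.
            while i + bound < len(a) and a[i + bound] < q:
                bound *= 2
            lo, hi = i, min(i + bound, len(a) - 1) + 1
        else:
            # Gallop left until q is bracketed or the start of the array is reached.
            while i - bound >= 0 and a[i - bound] >= q:
                bound *= 2
            lo, hi = max(i - bound, 0), i + 1
        # Binary search inside the O(d)-sized bracketing window.
        j = bisect.bisect_left(a, q, lo, hi)
        if j < len(a) and a[j] == q:
            self.finger = j  # successful query: move the finger here
            return True
        return False


# Queries near the previous successful query are answered in few steps.
s = FingerSearchArray(range(0, 1_000_000, 2))
print(s.contains(500_000), s.contains(500_006), s.contains(500_001))  # True True False
```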

18 citations

Journal ArticleDOI
TL;DR: The 3POL problem, an algebraic generalization of 3SUM where the sum function is replaced by a constant-degree polynomial in three variables, is considered, and it is proved that 3POL admits bounded-degree algebraic decision trees of depth $$O(n^{12/7+\varepsilon})$$.
Abstract: The 3SUM problem asks if an input n-set of real numbers contains a triple whose sum is zero. We qualify such a triple as degenerate because the probability of finding one in a random input is zero. We consider the 3POL problem, an algebraic generalization of 3SUM where we replace the sum function by a constant-degree polynomial in three variables. The motivations are threefold. Raz et al. gave an $$O(n^{11/6})$$ upper bound on the number of degenerate triples for the 3POL problem; we give algorithms for the corresponding problem of counting them. Grønlund and Pettie designed subquadratic algorithms for 3SUM; we prove that 3POL admits bounded-degree algebraic decision trees of depth $$O(n^{12/7+\varepsilon})$$, and we prove that 3POL can be solved in $$O(n^2 {(\log \log n)}^{3/2} / {(\log n)}^{1/2})$$ time in the real-RAM model, generalizing their results. Finally, we shed light on the General Position Testing (GPT) problem: "Given n points in the plane, do three of them lie on a line?", a key problem in computational geometry. We show how to solve GPT in subquadratic time when the input points lie on a small number of constant-degree polynomial curves. Many other geometric degeneracy testing problems reduce to 3POL.
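For context, the quadratic baseline that these subquadratic results improve on is the classical sort-and-two-pointer 3SUM algorithm, sketched below in Python. It reports a degenerate triple a + b + c = 0 in O(n^2) time and is offered as a textbook reference point, not as the decision-tree or real-RAM algorithm of the paper.

```python
def has_zero_triple(nums):
    """Classical O(n^2) 3SUM: return a triple (a, b, c) with a + b + c == 0,
    or None if the input contains no such 'degenerate' triple."""
    a = sorted(nums)
    n = len(a)
    for i in range(n - 2):
        # Fix a[i]; scan the remaining suffix with two pointers.
        lo, hi = i + 1, n - 1
        target = -a[i]
        while lo < hi:
            s = a[lo] + a[hi]
            if s == target:
                return (a[i], a[lo], a[hi])
            if s < target:
                lo += 1
            else:
                hi -= 1
    return None


print(has_zero_triple([4.5, -2.0, 7.0, -2.5, 1.0]))  # (-2.5, -2.0, 4.5)
print(has_zero_triple([1.0, 2.0, 3.0]))              # None
```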

18 citations

Proceedings ArticleDOI
08 Jun 2003
TL;DR: A new data structure is presented for planar point location in a triangulation of size n that executes a point location query quickly if it is spatially near the previous query.
Abstract: A new data structure is presented for planar point location that executes a point location query quickly if it is spatially near the previous query. Given a triangulation T of size n and a sequence of point location queries A = q1, ..., qm, the structure presented executes qi in time O(log d(qi-1, qi)). The distance function d that is used is a two-dimensional generalization of rank distance that counts the number of triangles in a region from qi-1 to qi. The data structure uses O(n log log n) space.
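The query pattern being exploited, moving from the previous query point toward the new one through the triangulation, can be illustrated with the standard "visibility walk" primitive: starting from the triangle containing qi-1, repeatedly cross an edge whose supporting line separates the current triangle from qi. The Python sketch below shows only that primitive on an assumed triangle-list-plus-neighbor representation; it illustrates the walking idea (and can cycle on some non-Delaunay triangulations), and is not the paper's O(n log log n)-space structure.

```python
def orient(a, b, c):
    """Twice the signed area of triangle abc; > 0 iff c lies left of ray a->b."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])


def walk(points, triangles, neighbors, start, q):
    """Visibility walk: starting from triangle index `start` (e.g. the triangle
    containing the previous query), step across any edge that separates the
    current triangle from q until q lies inside the current triangle.

    points:    list of (x, y) vertices
    triangles: list of (i, j, k) vertex-index triples, counter-clockwise
    neighbors: neighbors[t][e] = triangle across edge e of t, or None on the hull
    """
    t = start
    while True:
        i, j, k = triangles[t]
        a, b, c = points[i], points[j], points[k]
        moved = False
        for e, (u, v) in enumerate(((a, b), (b, c), (c, a))):
            if orient(u, v, q) < 0:          # q lies on the far side of edge e
                if neighbors[t][e] is None:  # walked off the convex hull
                    return None
                t = neighbors[t][e]
                moved = True
                break
        if not moved:
            return t                         # q is inside (or on) triangle t


# Unit square split along its diagonal into two counter-clockwise triangles.
points = [(0, 0), (1, 0), (1, 1), (0, 1)]
triangles = [(0, 1, 2), (0, 2, 3)]
neighbors = [[None, None, 1], [0, None, None]]
print(walk(points, triangles, neighbors, start=0, q=(0.25, 0.75)))  # -> 1
```

The number of triangles visited by such a walk is exactly the kind of quantity the paper's distance function d(qi-1, qi) measures.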

16 citations

Posted Content
19 Oct 2004
TL;DR: The optimal worst-case number of memory transfers for traversing a root-to-node path is determined up to constant factors.
Abstract: Consider laying out a fixed-topology tree of N nodes into external memory with block size B so as to minimize the worst-case number of block memory transfers required to traverse a path from the root to a node of depth D. We prove tight upper and lower bounds on the optimal number of memory transfers as a function of N, B, and D.
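As background, the classical cache-oblivious layout for a complete binary tree is the van Emde Boas layout: cut the tree at half its height, lay out the top subtree, then each bottom subtree, recursively, which gives O(log_B N) transfers per root-to-leaf path without knowing B. The Python sketch below computes that layout order for a heap-numbered complete tree; it is standard background only, not the worst-case-optimal layout for arbitrary fixed-topology trees that this paper establishes.

```python
def veb_layout(root, height, out):
    """Append the nodes of the complete binary subtree rooted at `root`
    (implicit heap numbering, `height` levels) in van Emde Boas order."""
    if height == 1:
        out.append(root)
        return
    top_h = height // 2
    bot_h = height - top_h
    veb_layout(root, top_h, out)            # lay out the top recursive subtree first
    # Roots of the bottom subtrees are the children of the top subtree's leaves.
    frontier = [root]
    for _ in range(top_h - 1):
        frontier = [c for v in frontier for c in (2 * v, 2 * v + 1)]
    for leaf in frontier:
        for child in (2 * leaf, 2 * leaf + 1):
            veb_layout(child, bot_h, out)   # then each bottom subtree in turn


order = []
veb_layout(1, 4, order)  # complete tree with 15 nodes, heap-numbered 1..15
print(order)             # [1, 2, 3, 4, 8, 9, 5, 10, 11, 6, 12, 13, 7, 14, 15]
```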

16 citations

Journal ArticleDOI
TL;DR: An algorithm is presented that preprocesses a connected planar subdivision G of size n and a query distribution D to produce a point location data structure for G that has an expected query time that is within a constant factor of optimal.
Abstract: A data structure is presented for point location in connected planar subdivisions when the distribution of queries is known in advance. The data structure has an expected query time that is within a constant factor of optimal. More specifically, an algorithm is presented that preprocesses a connected planar subdivision G of size n and a query distribution D to produce a point location data structure for G. The expected number of point-line comparisons performed by this data structure, when the queries are distributed according to D, is $$\tilde{H} + O(\tilde{H}^{1/2} + 1)$$, where $$\tilde{H} = \tilde{H}(G,D)$$ is a lower bound on the expected number of point-line comparisons performed by any linear decision tree for point location in G under the query distribution D. The preprocessing algorithm runs in O(n log n) time and produces a data structure of size O(n). These results are obtained by creating a Steiner triangulation of G that has near-minimum entropy.
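The entropy-type quantity here is the usual information-theoretic yardstick for distribution-sensitive searching: if a query lands in face f with probability p_f, roughly the Shannon entropy, the sum of p_f log2(1/p_f) over faces, comparisons are needed on average. The short Python example below computes that entropy for hypothetical face probabilities; it illustrates the quantity being matched, not the Steiner-triangulation construction itself.

```python
from math import log2


def entropy(face_probabilities):
    """Shannon entropy of the query distribution over the faces of the
    subdivision, the yardstick the expected query cost is measured against."""
    return sum(p * log2(1 / p) for p in face_probabilities if p > 0)


# Hypothetical subdivision with four faces; queries concentrate on one face.
skewed = [0.85, 0.05, 0.05, 0.05]
uniform = [0.25, 0.25, 0.25, 0.25]
print(round(entropy(skewed), 3))   # ~0.848 bits: most queries can be answered fast
print(round(entropy(uniform), 3))  # 2.0 bits: nothing to exploit, ~log2(4)
```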

16 citations


Cited by
Book
02 Jan 1991

1,377 citations

Journal Article
TL;DR: A deterministic algorithm for triangulating a simple polygon in linear time is given, using the polygon-cutting theorem and the planar separator theorem, whose role is essential in the discovery of new diagonals.
Abstract: We give a deterministic algorithm for triangulating a simple polygon in linear time. The basic strategy is to build a coarse approximation of a triangulation in a bottom-up phase and then use the information computed along the way to refine the triangulation in a top-down phase. The main tools used are the polygon-cutting theorem, which provides us with a balancing scheme, and the planar separator theorem, whose role is essential in the discovery of new diagonals. Only elementary data structures are required by the algorithm; in particular, no dynamic search trees are needed.
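Chazelle's linear-time algorithm is intricate; the simplest correct baseline it improves on is O(n^2) ear clipping, sketched below in Python under the assumptions of a counter-clockwise, strictly simple polygon with no repeated vertices and no three consecutive collinear vertices. It is offered as a point of contrast, not as the algorithm from the paper.

```python
def cross(o, a, b):
    """Z-component of (a - o) x (b - o); > 0 when o->a->b turns left."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])


def point_in_triangle(p, a, b, c):
    """True if p lies strictly inside the counter-clockwise triangle abc."""
    return cross(a, b, p) > 0 and cross(b, c, p) > 0 and cross(c, a, p) > 0


def ear_clip(polygon):
    """Triangulate a simple polygon (CCW vertex list) by repeatedly cutting off
    'ears': convex vertices whose clipping triangle contains no other polygon
    vertex. Runs in O(n^2) time and returns n - 2 triangles."""
    verts = list(polygon)
    triangles = []
    while len(verts) > 3:
        n = len(verts)
        for i in range(n):
            a, b, c = verts[i - 1], verts[i], verts[(i + 1) % n]
            if cross(a, b, c) <= 0:          # reflex (or flat) vertex: not an ear
                continue
            if any(point_in_triangle(p, a, b, c)
                   for p in verts if p not in (a, b, c)):
                continue                     # another vertex blocks this ear
            triangles.append((a, b, c))      # clip the ear
            del verts[i]
            break
    triangles.append(tuple(verts))
    return triangles


# A small non-convex (arrow-shaped) polygon, listed counter-clockwise.
poly = [(0, 0), (4, 0), (4, 3), (2, 1), (0, 3)]
print(ear_clip(poly))  # three triangles covering the polygon
```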

632 citations

01 Jan 1978

366 citations

Journal ArticleDOI
TL;DR: This article shows that scale-independent elementary geometric constructions and constrained optimization algorithms can be used to determine spatially modulated patterns that yield approximations to given surfaces of constant or varying curvature.
Abstract: Origami describes rules for creating folded structures from patterns on a flat sheet, but does not prescribe how patterns can be designed to fit target shapes. Here, starting from the simplest periodic origami pattern that yields one-degree-of-freedom collapsible structures, we show that scale-independent elementary geometric constructions and constrained optimization algorithms can be used to determine spatially modulated patterns that yield approximations to given surfaces of constant or varying curvature. Paper models confirm the feasibility of our calculations. We also assess the difficulty of realizing these geometric structures by quantifying the energetic barrier that separates the metastable flat and folded states. Moreover, we characterize the trade-off between the accuracy to which the pattern conforms to the target surface and the effort associated with creating finer folds. Our approach enables the tailoring of origami patterns to drape complex surfaces independent of absolute scale, as well as the quantification of the energetic and material cost of doing so.

336 citations

Journal ArticleDOI
TL;DR: The proposed method incorporates a voting scheme into the popular extreme learning machine (ELM) for classification applications and generally outperforms the original ELM algorithm as well as several recent classification algorithms.
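The idea of a voting-based ELM can be sketched generically: train several independent ELMs (single-hidden-layer networks with random hidden weights and a least-squares output layer) and classify each test point by majority vote over the ensemble. The NumPy sketch below is such a generic illustration with hypothetical parameter choices (hidden-layer size, ensemble size); it is not the authors' exact formulation.

```python
import numpy as np


class ELM:
    """Basic extreme learning machine: a random hidden layer followed by a
    least-squares (pseudo-inverse) output layer."""

    def __init__(self, n_hidden=50, rng=None):
        self.n_hidden = n_hidden
        self.rng = rng if rng is not None else np.random.default_rng()

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y, int)
        T = np.eye(y.max() + 1)[y]                       # one-hot class targets
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                 # random hidden features
        self.beta = np.linalg.pinv(H) @ T                # least-squares output weights
        return self

    def predict(self, X):
        H = np.tanh(np.asarray(X, float) @ self.W + self.b)
        return np.argmax(H @ self.beta, axis=1)


def voting_elm_predict(X_train, y_train, X_test, n_models=7, n_hidden=50, seed=0):
    """Train several independent ELMs and classify each test point by majority vote."""
    rng = np.random.default_rng(seed)
    votes = np.stack([ELM(n_hidden, rng).fit(X_train, y_train).predict(X_test)
                      for _ in range(n_models)])
    # Majority vote across the ensemble, column by column (one column per test point).
    return np.array([np.bincount(col).argmax() for col in votes.T])
```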

329 citations