
Showing papers by "Alejandro López-Ortiz published in 2006"


Book ChapterDOI
24 May 2006
TL;DR: A better algorithm for the intersection of large ordered sets is engineered, improving over those proposed by Demaine, Munro and Lopez-Ortiz [SODA 2000/ALENEX 2001] by using a variant of interpolation search.
Abstract: The intersection of large ordered sets is a common problem in the context of the evaluation of boolean queries to a search engine. In this paper we engineer a better algorithm for this task, which improves over those proposed by Demaine, Munro and Lopez-Ortiz [SODA 2000/ALENEX 2001], by using a variant of interpolation search. More specifically, our contributions are threefold. First, we corroborate and complete the practical study from Demaine et al. on comparison-based intersection algorithms. Second, we show that in practice replacing binary search and galloping (one-sided binary) search [4] by interpolation search improves the performance of each of the main intersection algorithms. Third, we introduce and test variants of interpolation search: this results in an even better intersection algorithm.
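To make the baseline concrete, the sketch below shows sorted-set intersection driven by galloping (one-sided binary) search, the technique the paper replaces with interpolation search. This is a textbook illustration under my own naming, not code from the paper.

```python
# Sketch of sorted-array intersection via galloping (one-sided binary)
# search: the searcher grows its step exponentially to bracket the
# target, then binary-searches inside the bracket.

def gallop(arr, target, lo):
    """Return the smallest index >= lo with arr[index] >= target."""
    step, hi = 1, lo
    while hi < len(arr) and arr[hi] < target:
        lo = hi + 1          # target lies strictly beyond hi
        hi += step
        step *= 2            # exponential step growth
    hi = min(hi, len(arr))
    while lo < hi:           # plain binary search in [lo, hi)
        mid = (lo + hi) // 2
        if arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return lo

def intersect(a, b):
    """Intersect two sorted lists, galloping through the longer one."""
    if len(a) > len(b):
        a, b = b, a
    out, pos = [], 0
    for x in a:
        pos = gallop(b, x, pos)
        if pos < len(b) and b[pos] == x:
            out.append(x)
    return out

print(intersect([1, 3, 7, 9], [3, 4, 7, 10, 11]))  # [3, 7]
```

Galloping costs O(log d) per probe, where d is the distance advanced, which is why it adapts well to skewed set sizes; the paper's contribution is showing that interpolation-based probing does even better on real search-engine data.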

79 citations


Proceedings Article
16 Jul 2006
TL;DR: Matching (i.e. optimal) upper and lower bounds are given for the acceleration ratio incurred when contract algorithms simulate interruptible algorithms using iterative deepening techniques.
Abstract: A contract algorithm is an algorithm which is given, as part of the input, a specified amount of allowable computation time. The algorithm must then compute a solution within the allotted time. An interruptible algorithm, in contrast, can be interrupted at an arbitrary point in time and must produce a solution. It is known that contract algorithms can simulate interruptible algorithms using iterative deepening techniques. This simulation is done at a penalty in the performance of the solution, as measured by the so-called acceleration ratio. In this paper we give matching (i.e. optimal) upper and lower bounds for the acceleration ratio under this simulation. This resolves an open conjecture of Bernstein et al. [IJCAI 2003], who gave an ingenious optimal schedule under the restricted setting of round-robin and length-increasing processor schedules, but whose optimality in the general unrestricted case remained open.
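The classical single-processor version of this simulation illustrates the acceleration ratio: run contracts of doubling lengths 1, 2, 4, ..., and on interruption at time t return the result of the longest completed contract. The sketch below is this textbook doubling schedule, not the paper's optimal multi-processor schedule; it empirically approaches the known ratio of 4 for this restricted setting.

```python
# Doubling schedule for simulating an interruptible algorithm with
# contract runs of lengths 1, 2, 4, 8, ...  The acceleration ratio is
# sup over interruption times t of t / (longest contract completed by t).

def longest_completed_contract(t):
    """Length of the longest contract finished by interruption time t."""
    elapsed, length, best = 0.0, 1.0, 0.0
    while elapsed + length <= t:
        elapsed += length    # run the current contract to completion
        best = length
        length *= 2          # next contract doubles in length
    return best

# Empirically estimate the acceleration ratio over a grid of
# interruption times; for this schedule it approaches (but stays
# below) 4.
ratio = max(t / longest_completed_contract(t)
            for t in [x / 10 for x in range(10, 2000)])
print(round(ratio, 2))
```

The worst interruptions arrive just before a contract finishes: the contract of length 2^k completes at time 2^(k+1) - 1, so an interruption just before that must fall back on the length-2^(k-1) result, giving a ratio approaching 4.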

17 citations


Proceedings Article
16 Jul 2006
TL;DR: This work presents a new propagator achieving bound consistency for the INTER-DISTANCE constraint in quadratic time, improving on the previous cubic bound and giving savings of an order of magnitude in a benchmark of scheduling problems.
Abstract: We present a new propagator achieving bound consistency for the INTER-DISTANCE constraint. This constraint ensures that, among a set of variables X1,..., Xn, the difference between any two variables is at least p. This restriction models, in particular, scheduling problems in which tasks require p contiguous units of a resource to be completed. Until now, the best known propagator for bound consistency had time complexity O(n³). In this work we propose a quadratic propagator for the same level of consistency. We then show that this theoretical gain gives savings of an order of magnitude in our benchmark of scheduling problems.
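To make the constraint concrete, here is a brute-force feasibility check for INTER-DISTANCE: given bounds [lb_i, ub_i] for each variable, it asks whether values can be chosen pairwise at least p apart. This exponential sketch is for illustration only; the paper's contribution is a quadratic propagator that filters the bounds themselves, which is not shown here.

```python
# Brute-force INTER-DISTANCE feasibility check (illustrative only).
# For a fixed ordering of the variables, the greedy choice
# x_i = max(lb_i, previous + p) is optimal, so the instance is
# feasible iff some ordering survives the greedy assignment.

from itertools import permutations

def inter_distance_feasible(bounds, p):
    """bounds: list of (lb, ub) pairs; returns True iff values exist
    with every pairwise difference at least p."""
    for order in permutations(bounds):
        t, ok = None, True
        for lb, ub in order:
            t = lb if t is None else max(lb, t + p)  # earliest legal value
            if t > ub:        # exceeds this variable's upper bound
                ok = False
                break
        if ok:
            return True
    return False

# Three variables in [0, 4] can sit at 0, 2, 4 with p = 2:
print(inter_distance_feasible([(0, 4), (0, 4), (0, 4)], 2))  # True
```

The scheduling reading is direct: each variable is the start of a task needing p contiguous resource units, and the constraint forbids any two tasks from overlapping.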

11 citations


Journal Article
TL;DR: In this paper, a variant of interpolation search is used to engineer an improved comparison-based algorithm for the intersection of large ordered sets, a common task in the evaluation of boolean queries to a search engine.
Abstract: The intersection of large ordered sets is a common problem in the context of the evaluation of boolean queries to a search engine. In this paper we engineer a better algorithm for this task, which improves over those proposed by Demaine, Munro and Lopez-Ortiz [SODA 2000/ALENEX 2001], by using a variant of interpolation search. More specifically, our contributions are threefold. First, we corroborate and complete the practical study from Demaine et al. on comparison-based intersection algorithms. Second, we show that in practice replacing binary search and galloping (one-sided binary) search [4] by interpolation search improves the performance of each of the main intersection algorithms. Third, we introduce and test variants of interpolation search: this results in an even better intersection algorithm.

4 citations


01 Jan 2006
TL;DR: In this paper, adaptive/cooperative analysis is introduced as an alternative framework for analyzing on-line algorithms and applied to the paging and list update problems; it is shown that the ability of an algorithm to ignore pathological worst cases can make it more efficient in practice.
Abstract: On-line algorithms are usually analyzed using competitive analysis, in which the performance of an on-line algorithm on a sequence is normalized by the performance of the optimal off-line algorithm on that sequence. In this paper we introduce adaptive/cooperative analysis as an alternative general framework for the analysis of on-line algorithms. This model gives promising results when applied to two well known on-line problems, paging and list update. The idea is to normalize the performance of an on-line algorithm by a measure other than the performance of the optimal off-line algorithm OPT. We show that in many instances the performance of OPT on a sequence is a coarse approximation of the difficulty or complexity of a given input. Using a finer, more natural measure we can separate paging and list update algorithms which were otherwise indistinguishable under the classical model. This creates a performance hierarchy of algorithms which better reflects the intuitive relative strengths between them. Lastly, we show that, surprisingly, certain randomized algorithms which are superior to MTF in the classical model are not so in the adaptive case. This confirms that the ability of the on-line adaptive algorithm to ignore pathological worst cases can lead to algorithms that are more efficient in practice.
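For readers unfamiliar with the list update setting, the sketch below simulates the textbook move-to-front (MTF) algorithm mentioned in the abstract, charging each access the 1-based position of the requested item. This is the standard cost model, not code from the paper.

```python
# Move-to-front (MTF) list update: each access costs the item's
# 1-based position in the list, and the accessed item is then moved
# to the front.

def mtf_cost(items, requests):
    """Total MTF access cost of serving a request sequence."""
    lst = list(items)
    total = 0
    for r in requests:
        i = lst.index(r)           # access cost is position i + 1
        total += i + 1
        lst.insert(0, lst.pop(i))  # move the accessed item to the front
    return total

# Repeated accesses to a recently used item become cheap:
print(mtf_cost(['a', 'b', 'c'], ['c', 'c', 'b', 'a']))  # 10
```

Under competitive analysis MTF is 2-competitive against the optimal off-line strategy; the paper's point is that a finer, adaptive measure of input difficulty can separate algorithms that this classical ratio leaves indistinguishable.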

1 citation