
Showing papers by "Alejandro López-Ortiz published in 2016"


Journal ArticleDOI
01 Jan 2016
TL;DR: In this article, the authors considered the online bin packing problem under the advice complexity model and provided tight upper and lower bounds for the amount of advice an algorithm needs to achieve an optimal packing.
Abstract: We consider the online bin packing problem under the advice complexity model where the "online constraint" is relaxed and an algorithm receives partial information about the future items. We provide tight upper and lower bounds for the amount of advice an algorithm needs to achieve an optimal packing. We also introduce an algorithm that, when provided with $\log n + o(\log n)$ bits of advice, achieves a competitive ratio of $3/2$ for the general problem. This algorithm is simple and is expected to find real-world applications. We introduce another algorithm that receives $2n + o(n)$ bits of advice and achieves a competitive ratio of $4/3 + \varepsilon$. Finally, we provide a lower bound argument that implies that advice of linear size is required for an algorithm to achieve a competitive ratio better than $9/8$.
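For readers unfamiliar with the model, the baseline being improved upon is classical online bin packing with no advice. The sketch below implements First Fit, a standard advice-free online heuristic (it is not one of the paper's advice-based algorithms): each arriving item is placed in the first open bin with room, and a new bin is opened only when none fits.

```python
# First Fit: a classic advice-free online bin packing baseline
# (illustrative only; not the advice-based algorithms from the paper).

def first_fit(items, capacity=1.0):
    """Pack items into unit-capacity bins online; returns bin contents."""
    bins = []  # each bin is a list of item sizes
    for size in items:
        for b in bins:
            # place the item in the first bin that still has room
            if sum(b) + size <= capacity + 1e-9:
                b.append(size)
                break
        else:
            bins.append([size])  # no existing bin fits: open a new one
    return bins

packing = first_fit([0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5, 0.1])
print(len(packing))  # number of bins First Fit used
```

Advice bits, in this framework, let the algorithm beat the worst-case ratios that any such advice-free policy is stuck with.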

59 citations


Journal ArticleDOI
TL;DR: This work considers the problem of managing a bounded size First-In-First-Out (FIFO) queue buffer, where each incoming unit-sized packet requires several rounds of processing before it can be transmitted out.
Abstract: We consider the problem of managing a bounded size First-In-First-Out (FIFO) queue buffer, where each incoming unit-sized packet requires several rounds of processing before it can be transmitted out. Our objective is to maximize the total number of successfully transmitted packets. We consider both push-out (when a policy is permitted to drop already admitted packets) and non-push-out cases. We provide worst-case guarantees for the throughput performance of our algorithms, proving both lower and upper bounds on their competitive ratio against the optimal algorithm, and conduct a comprehensive simulation study that experimentally validates predicted theoretical behavior.
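The model can be made concrete with a toy simulator. The sketch below implements a plain non-push-out greedy admission policy (illustrative only; it is not one of the paper's algorithms): unit-sized packets arrive in batches, each carrying a number of required processing rounds, the head-of-line packet receives one round of work per time step, and a packet is dropped only on arrival when the buffer is full.

```python
from collections import deque

# Toy simulator for the bounded FIFO buffer model described above.
# `arrivals[t]` lists the required-processing counts of packets arriving
# at step t; the non-push-out policy never evicts an admitted packet.

def simulate_fifo(arrivals, buffer_size):
    queue = deque()      # packets, stored as remaining processing rounds
    transmitted = 0
    for batch in arrivals:
        for work in batch:
            if len(queue) < buffer_size:   # admit while space remains
                queue.append(work)         # non-push-out: never evict
        if queue:                          # one processing round per step
            queue[0] -= 1
            if queue[0] == 0:
                queue.popleft()
                transmitted += 1
    while queue:                           # drain after arrivals end
        queue[0] -= 1
        if queue[0] == 0:
            queue.popleft()
            transmitted += 1
    return transmitted

# A heavy (3-round) packet admitted first forces two cheap packets to be
# dropped when the buffer (size 2) is full -- the loss a push-out policy
# could avoid.
print(simulate_fifo([[3, 1, 1], [1], []], buffer_size=2))
```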

17 citations


Journal ArticleDOI
TL;DR: It is shown that for graphs of size N and treewidth α, there is an online algorithm that receives O(n(log α + log log N)) bits of advice and optimally serves any sequence of length n, and it is proved that if a graph admits a system of μ collective tree (q, r)-spanners, then there is a (q + r)-competitive algorithm which requires O(n(log μ + log log N)) bits of advice.
Abstract: We consider the k-Server problem under the advice model of computation when the underlying metric space is sparse. On one side, we introduce O(1)-competitive algorithms for a wide range of sparse graphs. These algorithms require advice of (almost) linear size. We show that for graphs of size N and treewidth α, there is an online algorithm that receives O(n(log α + log log N)) bits of advice and optimally serves any sequence of length n. We also prove that if a graph admits a system of μ collective tree (q, r)-spanners, then there is a (q + r)-competitive algorithm which requires O(n(log μ + log log N)) bits of advice. Among other results, this gives a 3-competitive algorithm for planar graphs, when provided with O(n log log N) bits of advice. On the other side, we prove that advice of size Ω(n) is required to obtain a 1-competitive algorithm for sequences of length n even for the 2-server problem on a path metric of size N ≥ 3. Through another lower bound argument, we show that at least $\frac{n}{2}(\log \alpha - 1.22)$ bits of advice is required to obtain an optimal solution for metric spaces of treewidth α, where 4 ≤ α < 2k.

8 citations


Journal ArticleDOI
TL;DR: In this article, the authors consider the problem of managing a bounded size queue buffer where traffic consists of packets of varying size, each packet requires several rounds of processing before it can be transmitted out, and the goal is to maximize the throughput, i.e., total size of successfully transmitted packets.

5 citations


Posted Content
TL;DR: The $\Pi$-Packing with $\alpha()$-Overlap problem is introduced to allow for more complex constraints in the overlap region than those previously studied, and several examples of $\alpha()$ functions which meet those conditions are given.
Abstract: In earlier versions of the community discovering problem, the overlap between communities was restricted by a simple count upper-bound [17,5,11,8]. In this paper, we introduce the $\Pi$-Packing with $\alpha()$-Overlap problem to allow for more complex constraints in the overlap region than those previously studied. Let $\mathcal{V}^r$ be all possible subsets of vertices of $V(G)$ each of size at most $r$, and $\alpha: \mathcal{V}^r \times \mathcal{V}^r \to \{0,1\}$ be a function. The $\Pi$-Packing with $\alpha()$-Overlap problem seeks at least $k$ induced subgraphs in a graph $G$ subject to: (i) each subgraph has at most $r$ vertices and obeys a property $\Pi$, and (ii) for any pair $H_i,H_j$, with $i \neq j$, $\alpha(H_i, H_j) = 0$ (i.e., $H_i,H_j$ do not conflict). We also consider a variant that arises in clustering applications: each subgraph of a solution must contain a set of vertices from a given collection of sets $\mathcal{C}$, and no pair of subgraphs may share vertices from the sets of $\mathcal{C}$. In addition, we propose similar formulations for packing hypergraphs. We give an $O(r^{rk} k^{(r+1)k} n^{cr})$ algorithm for our problems where $k$ is the parameter and $c$ and $r$ are constants, provided that: i) $\Pi$ is computable in polynomial time in $n$ and ii) the function $\alpha()$ satisfies specific conditions. Specifically, $\alpha()$ is hereditary, applicable only to overlapping subgraphs, and computable in polynomial time in $n$. Motivated by practical applications we give several examples of $\alpha()$ functions which meet those conditions.

3 citations


Book ChapterDOI
01 Jan 2016
TL;DR: This work compares the quality of the solution obtained by the online algorithm with the one computed in the presence of full information, namely, that of the offline optimal OPT, in the worst case.
Abstract: While the competitive ratio [19] is the most common metric in online algorithm analysis and has led to a vast amount of knowledge in the field, there are numerous known applications in which the competitive ratio produces unsatisfactory results. Far too often, it leads to unrealistically pessimistic measures, including the failure to distinguish between algorithms that have vastly differing performance under any practical characterization. Because of this, there has been extensive research into alternatives to the competitive ratio, with a renewed effort in the period from 2005 to the present. The competitive ratio metric can be derived from the observation that an online algorithm, in essence, computes a partial solution to a problem using incomplete information. It is then only natural to quantify the performance drop due to this absence of information. That is, we compare, in the worst case, the quality of the solution obtained by the online algorithm with the one computed in the presence of full information, namely that of the optimal offline algorithm OPT. More formally,
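The excerpt ends before the formal statement; the standard worst-case definition from the competitive analysis literature, which the chapter's formalization follows, reads (cost-minimization form, with an additive constant $b$ absorbing start-up effects):

```latex
% Standard definition of the competitive ratio (cost minimization):
% ALG is c-competitive if its cost is within a factor c of OPT's,
% up to an additive constant, on every input sequence.
\[
  \mathrm{ALG} \text{ is } c\text{-competitive if }
  \exists\, b \ge 0 \text{ such that } \quad
  \mathrm{ALG}(\sigma) \;\le\; c \cdot \mathrm{OPT}(\sigma) + b
  \quad \text{for every input sequence } \sigma .
\]
```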

2 citations


Book ChapterDOI
29 Mar 2016
TL;DR: In this paper, the problem of multiple agents or robots searching in a coordinated fashion for a target in the plane was considered, and an optimal strategy for searching with k robots starting from a common origin and moving at unit speed was developed.
Abstract: We consider the problem of multiple agents or robots searching in a coordinated fashion for a target in the plane. This is motivated by Search and Rescue (SAR) operations in the high seas, which in the past were often performed with several vessels, and more recently by swarms of aerial drones and/or unmanned surface vessels. Coordinating such a search in an effective manner is a nontrivial task. In this paper, we first develop an optimal strategy for searching with k robots starting from a common origin and moving at unit speed. We then apply the results from this model to more realistic scenarios such as differential search speeds, late arrival times to the search effort, and low probability of detection under poor visibility conditions. We show that, surprisingly, the theoretical idealized model still governs the search with certain suitable minor adaptations.

2 citations


DOI
01 Jan 2016
TL;DR: The extended abstracts included in this report both present recent state-of-the-art advances and lay the foundation for new directions within data structures research.
Abstract: This report documents the program and the outcomes of Dagstuhl Seminar 16101 "Data Structures and Advanced Models of Computation on Big Data". In today's computing environment vast amounts of data are processed, exchanged and analyzed. The manner in which information is stored profoundly influences the efficiency of these operations over the data. In spite of the maturity of the field many data structuring problems are still open, while new ones arise due to technological advances. The seminar covered both recent advances in the "classical" data structuring topics as well as new models of computation adapted to modern architectures, scientific studies that reveal the need for such models, applications where large data sets play a central role, modern computing platforms for very large data, and new data structures for large data in modern architectures. The extended abstracts included in this report both present recent state-of-the-art advances and lay the foundation for new directions within data structures research.

1 citation


Posted Content
TL;DR: In this paper, the problem of determining if a given graph corresponds to the dual of a triangulation of a simple polygon is investigated; the difficulty of this problem depends critically on the amount of information given, and a sharp boundary is drawn between the various tractable and intractable versions of the problem.
Abstract: We investigate the problem of determining if a given graph corresponds to the dual of a triangulation of a simple polygon. This is a graph recognition problem, where in our particular case we wish to recognize a graph which corresponds to the dual of a triangulation of a simple polygon with or without holes and interior points. We show that the difficulty of this problem depends critically on the amount of information given and we give a sharp boundary between the various tractable and intractable versions of the problem.

1 citation


Posted Content
TL;DR: In this paper, the authors consider the case in which the counters are associated with the nodes, which for the case of dual graphs of geometric spaces could be argued to be intuitively more natural and likely more efficient.
Abstract: We give lower bounds for various natural node- and edge-based local strategies for exploring a graph. We consider this problem both in the setting of an arbitrary graph as well as the abstraction of a geometric exploration of a space by a robot, both of which have been extensively studied. We consider local exploration policies that use time-of-last-visit or alternatively least-frequently-visited local greedy strategies to select the next step in the exploration path. Both of these strategies were previously considered by Cooper et al. (2011) for a scenario in which counters for the last visit or visit frequency are attached to the edges. In this work we consider the case in which the counters are associated with the nodes, which for the case of dual graphs of geometric spaces could be argued to be intuitively more natural and likely more efficient. Surprisingly, these alternate strategies give worst-case superpolynomial/exponential time for exploration, whereas the least-frequently-visited strategy for edges has a polynomially bounded exploration time, as shown by Cooper et al. (2011).
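The node-counter policy discussed above is simple to state in code. The sketch below (illustrative only; the graph and helper names are ours, and this is not the paper's lower-bound construction) keeps a visit counter on each node and, at every step, moves to the neighbor whose counter is smallest:

```python
# Least-frequently-visited NODE strategy: a local greedy walk that
# always steps to the neighbor with the smallest visit counter.
# Graph is an adjacency dict {node: [neighbors]}.

def lfv_node_walk(graph, start, steps):
    visits = {v: 0 for v in graph}
    current = start
    visits[current] = 1
    path = [current]
    for _ in range(steps):
        # greedy local rule: pick the least-frequently-visited neighbor
        # (ties broken by adjacency-list order)
        current = min(graph[current], key=lambda v: visits[v])
        visits[current] += 1
        path.append(current)
    return path

# A 4-cycle: the walk keeps drifting toward the less-visited side.
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
walk = lfv_node_walk(cycle, 0, 6)
```

The paper's point is that this node-based rule, despite looking as reasonable as its edge-based counterpart, admits graphs on which exploration takes superpolynomial or exponential time.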


Proceedings Article
01 Jan 2016
TL;DR: The problem of determining if a given graph corresponds to the dual of a triangulation of a simple polygon is investigated and a sharp boundary is given between the various tractable and intractable versions of the problem.
Abstract: We investigate the problem of determining if a given graph corresponds to the dual of a triangulation of a simple polygon. This is a graph recognition problem, where in our particular case we wish to recognize a graph which corresponds to the dual of a triangulation of a simple polygon with or without holes and interior points. We show that the difficulty of this problem depends critically on the amount of information given and we give a sharp boundary between the various tractable and intractable versions of the problem.

Book ChapterDOI
29 Mar 2016
TL;DR: Surprisingly, these alternate strategies give worst-case superpolynomial/exponential time for exploration, whereas the least-frequently-visited strategy for edges has a polynomially bounded exploration time, as shown by Cooper et al. (2011).
Abstract: We give lower bounds for various natural node- and edge-based local strategies for exploring a graph. We consider this problem both in the setting of an arbitrary graph as well as the abstraction of a geometric exploration of a space by a robot, both of which have been extensively studied. We consider local exploration policies that use time-of-last-visit or alternatively least-frequently-visited local greedy strategies to select the next step in the exploration path. Both of these strategies were previously considered by Cooper et al. (2011) for a scenario in which counters for the last visit or visit frequency are attached to the edges. In this work we consider the case in which the counters are associated with the nodes, which for the case of dual graphs of geometric spaces could be argued to be intuitively more natural and likely more efficient. Surprisingly, these alternate strategies give worst-case superpolynomial/exponential time for exploration, whereas the least-frequently-visited strategy for edges has a polynomially bounded exploration time, as shown by Cooper et al. (2011).