
Showing papers by "Aravind Srinivasan published in 2008"


Proceedings ArticleDOI
13 Apr 2008
TL;DR: This paper develops polynomial-time algorithms to provably approximate the total throughput of the capacity estimation problem under the more general Signal to Interference Plus Noise Ratio (SINR) model of interference, on arbitrary wireless networks.
Abstract: A fundamental problem in wireless networks is to estimate the network's throughput capacity - given a set of wireless nodes and a set of connections, what is the maximum rate at which data can be sent on these connections? Most of the research in this direction has focused on either random distributions of points, or has assumed simple graph-based models for wireless interference. In this paper, we study the capacity estimation problem using the more general Signal to Interference Plus Noise Ratio (SINR) model for interference, on arbitrary wireless networks. The problem becomes much harder in this setting because of the non-locality of the SINR model. Recent work by Moscibroda et al. (2006) has shown that the throughput in this model can differ significantly from graph-based models. We develop polynomial time algorithms to provably approximate the total throughput in this setting.
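To make the SINR interference model concrete, here is a minimal sketch of the standard SINR feasibility condition for a set of simultaneously active links, assuming uniform transmit power and a path-loss exponent alpha. This illustrates only the model, not the paper's approximation algorithm; all names and parameters are illustrative.

```python
import math

def sinr_feasible(links, power, alpha, noise, beta):
    """Check whether all links in `links` can transmit simultaneously
    under the standard SINR model: each link is a (sender, receiver)
    pair of 2-D points, and a link succeeds if its received signal
    divided by (noise + total interference) is at least beta."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    for s, r in links:
        signal = power / dist(s, r) ** alpha
        # interference at r from every other active sender
        interference = sum(power / dist(s2, r) ** alpha
                           for s2, r2 in links if (s2, r2) != (s, r))
        if signal / (noise + interference) < beta:
            return False
    return True
```

Note the non-locality the abstract mentions: every active sender in the network contributes to the interference sum at every receiver, so feasibility cannot be decided by inspecting a link's graph neighborhood alone.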

118 citations


Proceedings Article
20 Jan 2008
TL;DR: The running time of all efficient algorithmic versions of the Local Lemma (which cover all applications in the Molloy & Reed framework) is improved to polynomial, and the parallel algorithm for hypergraph coloring due to Alon is also improved.
Abstract: The Lovász Local Lemma is a powerful tool in combinatorics and computer science. The original version of the lemma was nonconstructive, and efficient algorithmic versions have been developed by Beck, Alon, Molloy & Reed, et al. In particular, the work of Molloy & Reed lets us automatically extract efficient versions of essentially any application of the symmetric version of the Local Lemma. However, with some exceptions, there is a significant gap between what one can prove using the original Lemma nonconstructively, and what is possible through these efficient versions; also, some of these algorithmic versions run in super-polynomial time. Here, we lessen this gap, and improve the running time of all these applications (which cover all applications in the Molloy & Reed framework) to polynomial. We also improve upon the parallel algorithmic version of the Local Lemma for hypergraph coloring due to Alon, by allowing noticeably more overlap among the edges.
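For context, the symmetric Local Lemma the abstract refers to has a simple numeric condition: if every bad event has probability at most p and depends on at most d others, and e·p·(d+1) ≤ 1, then with positive probability no bad event occurs. A minimal sketch of this condition applied to hypergraph 2-coloring (an illustration of the lemma's statement, not of the paper's algorithms):

```python
import math

def symmetric_lll_applies(p, d):
    """Symmetric Lovász Local Lemma condition: each bad event has
    probability at most p and depends on at most d other bad events;
    e * p * (d + 1) <= 1 guarantees some outcome avoids all of them."""
    return math.e * p * (d + 1) <= 1

def hypergraph_two_colorable(k, d):
    """Classic application: in a k-uniform hypergraph where each edge
    shares a vertex with at most d other edges, a uniformly random
    2-coloring makes a fixed edge monochromatic with probability
    2**(1 - k), so the LLL applies when e * 2**(1-k) * (d+1) <= 1."""
    return symmetric_lll_applies(2 ** (1 - k), d)
```

The algorithmic versions discussed in the paper turn this existence guarantee into an efficient procedure that actually finds such a coloring.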

66 citations


Book ChapterDOI
25 Aug 2008
TL;DR: A rounding of the natural LP relaxation shows that the full-information budgeted-allocation problem can be approximated to within 4/3, matching the known lower bound on the integrality gap.
Abstract: We build on the work of Andelman & Mansour and Azar, Birnbaum, Karlin, Mathieu & Thach Nguyen to show that the full-information (i.e., offline) budgeted-allocation problem can be approximated to within 4/3: we conduct a rounding of the natural LP relaxation, for which our algorithm matches the known lower-bound on the integrality gap.
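To fix the objective being approximated, here is a brute-force (exponential-time) computation of the budgeted-allocation optimum on a tiny instance. This only illustrates the problem definition, under the usual convention that an agent pays the sum of its winning bids truncated at its budget; the paper's contribution is the 4/3-approximation via LP rounding, which is not shown here.

```python
from itertools import product

def budgeted_allocation_opt(bids, budgets):
    """Exact optimum of budgeted allocation by exhaustive search.
    bids[i][j] is agent i's bid on item j; the revenue from agent i
    is min(budget_i, sum of i's winning bids). Exponential in the
    number of items -- for intuition on small instances only."""
    n_agents, n_items = len(bids), len(bids[0])
    best = 0
    for assign in product(range(n_agents), repeat=n_items):
        revenue = sum(min(budgets[i],
                          sum(bids[i][j] for j in range(n_items)
                              if assign[j] == i))
                      for i in range(n_agents))
        best = max(best, revenue)
    return best
```

The budget truncation is what makes the problem hard (and the natural LP's integrality gap nontrivial): without it, greedily giving each item to its highest bidder is optimal.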

44 citations


Proceedings ArticleDOI
13 Apr 2008
TL;DR: This work designs simple and distributed channel-access strategies for random-access networks which are provably competitive with respect to the optimal scheduling strategy, which is deterministic, centralized, and computationally infeasible.
Abstract: We study the throughput capacity of wireless networks which employ (asynchronous) random-access scheduling as opposed to deterministic scheduling. The central question we answer is: how should we set the channel-access probability for each link in the network so that the network operates close to its optimal throughput capacity? We design simple and distributed channel-access strategies for random-access networks which are provably competitive with respect to the optimal scheduling strategy, which is deterministic, centralized, and computationally infeasible. We show that the competitiveness of our strategies is nearly the best achievable via random-access scheduling, thus establishing fundamental limits on the performance of random access. A notable outcome of our work is that random access compares well with deterministic scheduling when link transmission durations differ by small factors, and much worse otherwise. The distinguishing aspects of our work include modeling and rigorous analysis of asynchronous communication, asymmetry in link transmission durations, and hidden terminals under arbitrary link-conflict based wireless interference models.
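The basic trade-off behind the channel-access probabilities can be seen in a simplified synchronous sketch (the paper analyzes the much harder asynchronous setting with heterogeneous transmission durations): a link succeeds in a slot when it transmits and none of its conflicting links do. Names below are illustrative.

```python
def success_probability(p, conflicts, i):
    """Per-slot success probability of link i in a slotted random-access
    scheme: link i transmits with probability p[i], and succeeds only
    if no link in conflicts[i] (its conflict-graph neighbors) also
    transmits. Assumes independent access decisions."""
    prob = p[i]
    for j in conflicts[i]:
        prob *= (1 - p[j])
    return prob
```

Raising p[i] helps link i directly but suppresses every conflicting link, which is why choosing the access probabilities to stay competitive with an optimal centralized schedule is nontrivial.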

24 citations


Journal ArticleDOI
TL;DR: A parameterized backbone construction algorithm that permits explicit trade-offs between backbone size, resilience to node movement and failure, energy consumption, and path lengths is presented and it is proved that the scheme can construct essentially best possible backbones when the network is relatively static.
Abstract: We consider the problem of finding "backbones" in multihop wireless networks. The backbone provides end-to-end connectivity, allowing nonbackbone nodes to save energy since they do not have to route nonlocal data or participate in the routing protocol. Ideally, such a backbone would be small, consist primarily of high capacity nodes, and remain connected even when nodes are mobile or fail. Unfortunately, it is often infeasible to construct a backbone that has all of these properties; e.g., a small optimal backbone is often too sparse to handle node failures or high mobility. We present a parameterized backbone construction algorithm that permits explicit trade-offs between backbone size, resilience to node movement and failure, energy consumption, and path lengths. We prove that our scheme can construct essentially best possible backbones (with respect to energy consumption and backbone size) when the network is relatively static. We generalize our scheme to build more robust structures better suited to networks with higher mobility. We present a distributed protocol based upon our algorithm and show that this protocol builds and maintains a connected backbone in dynamic networks. Finally, we present detailed packet-level simulation results to evaluate and compare our scheme with existing energy-saving techniques. Our results show that, depending on the network environment, our scheme increases network lifetimes by 20 percent to 220 percent without adversely affecting delivery ratio or end-to-end latency.
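A rough stand-in for the "backbone" notion is a dominating set: every node is either in the backbone or adjacent to it. The greedy sketch below conveys the flavor of backbone selection only; the paper's actual algorithm is parameterized and additionally handles connectivity, node capacity, energy, and mobility.

```python
def greedy_dominating_set(adj):
    """Greedy dominating set on an undirected graph given as an
    adjacency dict {node: [neighbors]}: repeatedly pick the node that
    covers the most still-uncovered nodes (itself plus neighbors)."""
    uncovered = set(adj)
    backbone = []
    while uncovered:
        v = max(adj, key=lambda u: len(({u} | set(adj[u])) & uncovered))
        backbone.append(v)
        uncovered -= {v} | set(adj[v])
    return backbone
```

As the abstract notes, a smallest-possible set like this one is often too sparse to survive node failures or mobility, which motivates the paper's explicit trade-off parameters.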

13 citations


Book ChapterDOI
07 Jul 2008
TL;DR: A simple symmetry-breaking augmentation to the randomized coloring procedure is presented that works well in conjunction with Azuma's Martingale Inequality to easily yield the requisite concentration bounds for key random variables.
Abstract: A basic randomized coloring procedure has been used in probabilistic proofs to obtain remarkably strong results on graph coloring. These results include the asymptotic version of the List Coloring Conjecture due to Kahn, the extensions of Brooks' Theorem to sparse graphs due to Kim and Johansson, and Luby's fast parallel and distributed algorithms for graph coloring. The most challenging aspect of a typical probabilistic proof is showing adequate concentration bounds for key random variables. In this paper, we present a simple symmetry-breaking augmentation to the randomized coloring procedure that works well in conjunction with Azuma's Martingale Inequality to easily yield the requisite concentration bounds. We use this approach to obtain a number of results in two areas: frugal coloring and weighted equitable coloring. A β-frugal coloring of a graph G is a proper vertex-coloring of G in which no color appears more than β times in any neighborhood. Let G = (V, E) be a vertex-weighted graph with weight function w: V → [0, 1] and let W = Σ_{v ∈ V} w(v). A weighted equitable coloring of G is a proper k-coloring such that the total weight of every color class is "large", i.e., "not much smaller" than W/k; this notion is useful in obtaining tail bounds for sums of dependent random variables.
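The definition of β-frugality in the abstract is easy to state as a checker: the coloring must be proper, and within each vertex's neighborhood no color may repeat more than β times. This sketch verifies the definition only; it is not the paper's coloring procedure.

```python
from collections import Counter

def is_beta_frugal(adj, coloring, beta):
    """Check that `coloring` (a dict node -> color) is a proper,
    beta-frugal coloring of the graph `adj` (dict node -> neighbor
    list): no neighbor shares v's color, and no color occurs more
    than beta times among v's neighbors."""
    for v, neighbors in adj.items():
        if any(coloring[u] == coloring[v] for u in neighbors):
            return False  # not a proper coloring
        counts = Counter(coloring[u] for u in neighbors)
        if counts and max(counts.values()) > beta:
            return False  # some color too frequent in this neighborhood
    return True
```

A 1-frugal coloring, for instance, requires all of a vertex's neighbors to receive distinct colors, which is the strongest case of the definition.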

10 citations



Journal ArticleDOI
TL;DR: An elementary approach is presented to obtain a Chernoff-type upper-tail bound for the number of prime factors of a random integer in {1,2,...,n}.
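The quantity being bounded is ω(n), the number of distinct prime factors of n, whose average over {1, 2, ..., n} grows like ln ln n (Hardy-Ramanujan); the paper's Chernoff-type bound controls how often ω substantially exceeds this. A trial-division sketch of ω itself:

```python
def omega(n):
    """Number of distinct prime factors of n by trial division.
    E.g. omega(60) = 3, since 60 = 2^2 * 3 * 5."""
    count, d = 0, 2
    while d * d <= n:
        if n % d == 0:
            count += 1
            while n % d == 0:
                n //= d  # strip all copies of this prime
        d += 1
    if n > 1:          # leftover factor is a prime > sqrt(original n)
        count += 1
    return count
```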

1 citation