Proceedings ArticleDOI

Improved Analysis of Deterministic Load-Balancing Schemes

TL;DR: In this article, the authors consider the problem of deterministic load balancing of tokens in the discrete model, where each node exchanges some of its tokens with each of its neighbors in the network.
Abstract: We consider the problem of deterministic load balancing of tokens in the discrete model. A set of n processors is connected into a d-regular undirected network. In every time step, each processor exchanges some of its tokens with each of its neighbors in the network. The goal is to minimize the discrepancy between the number of tokens on the most-loaded and the least-loaded processor as quickly as possible. Rabani et al. (1998) present a general technique for the analysis of a wide class of discrete load balancing algorithms. Their approach is to characterize the deviation between the actual loads of a discrete balancing algorithm and the distribution generated by a related Markov chain. The Markov chain can also be regarded as the underlying model of a continuous diffusion algorithm. Rabani et al. showed that after time T = O(log(Kn)/μ), any algorithm of their class achieves a discrepancy of O(d log n/μ), where μ is the spectral gap of the transition matrix of the graph, and K is the initial load discrepancy in the system. In this work we identify some natural additional conditions on deterministic balancing algorithms, resulting in a class of algorithms reaching a smaller discrepancy. This class contains well-known algorithms, e.g., the rotor-router. Specifically, we introduce the notion of cumulatively fair load-balancing algorithms where, in any interval of consecutive time steps, the total number of tokens sent out over an edge by a node is the same (up to constants) for all adjacent edges. We prove that algorithms which are cumulatively fair and in which every node retains a sufficient part of its load in each step achieve a discrepancy of O(min{d√(log n/μ), d√n}) in time O(T). We also show that in general neither of these assumptions can be omitted without increasing the discrepancy. We then show by a combinatorial potential reduction argument that any cumulatively fair scheme satisfying some additional assumptions achieves a discrepancy of O(d) almost as quickly as the continuous diffusion process. This positive result applies to some of the simplest and most natural discrete load balancing schemes.
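For intuition, the following is a minimal Python sketch (an illustration under simplifying assumptions, not the authors' exact scheme) of one cumulatively fair balancing step: every node retains about half of its tokens and deals the rest to its neighbors round-robin, with the rotor pointer persisting across steps so that, over any interval, each incident edge receives the same number of tokens up to a constant.

```python
import networkx as nx

def rotor_diffusion_step(graph, load, pointer):
    """One synchronous balancing step. `pointer[v]` remembers where the
    round-robin distribution at node v stopped in previous steps, which is
    what makes the scheme cumulatively fair over any interval of steps."""
    incoming = {v: 0 for v in graph}
    for v in graph:
        neighbours = list(graph.neighbors(v))
        keep = (load[v] + 1) // 2              # each node retains about half its load
        to_send = load[v] - keep
        load[v] = keep
        for _ in range(to_send):               # deal the remaining tokens round-robin
            u = neighbours[pointer[v]]
            pointer[v] = (pointer[v] + 1) % len(neighbours)
            incoming[u] += 1
    for v in graph:
        load[v] += incoming[v]

# usage: a 4-regular torus with all tokens initially on a single node
G = nx.grid_2d_graph(8, 8, periodic=True)
load = {v: 0 for v in G}
load[(0, 0)] = 1000
pointer = {v: 0 for v in G}
for _ in range(200):
    rotor_diffusion_step(G, load, pointer)
print("discrepancy:", max(load.values()) - min(load.values()))
```

The persistent pointer provides the cumulative fairness, and the "keep about half" rule plays the role of the load-retention condition discussed in the abstract.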
Citations
Proceedings ArticleDOI
22 Jul 2013
TL;DR: This work considers the setting in which multiple, indistinguishable agents are deployed in parallel in the nodes of the graph and move around the graph in synchronous rounds, interacting with a single rotor-router system; the results suggest a strong similarity between the performance characteristics of this deterministic model and random walks.
Abstract: The rotor-router mechanism was introduced as a deterministic alternative to the random walk in undirected graphs. In this model, an agent is initially placed at one of the nodes of the graph. Each node maintains a cyclic ordering of its outgoing arcs, and during successive visits of the agent, propagates it along arcs chosen according to this ordering in round-robin fashion. In this work we consider the setting in which multiple, indistinguishable agents are deployed in parallel in the nodes of the graph, and move around the graph in synchronous rounds, interacting with a single rotor-router system. We propose new techniques which allow us to perform a theoretical analysis of the multi-agent rotor-router model, and to compare it to the scenario of parallel independent random walks in a graph. Our main results concern the n-node ring, and suggest a strong similarity between the performance characteristics of this deterministic model and random walks. We show that on the ring the rotor-router with k agents admits a cover time of between Θ(n^2/k^2) in the best case and Θ(n^2/log k) in the worst case, depending on the initial locations of the agents, and that both these bounds are tight. The corresponding expected value of cover time for k random walks, depending on the initial locations of the walkers, is proven to belong to a similar range, namely between Θ(n^2/(k^2/log^2 k)) and Θ(n^2/log k). Finally, we study the limit behavior of the rotor-router system. We show that, once the rotor-router system has stabilized, all the nodes of the ring are always visited by some agent every Θ(n/k) steps, regardless of how the system was initialized. This asymptotic bound corresponds to the expected time between successive visits to a node in the case of k random walks. All our results hold up to a polynomially large number of agents (1 ≤ k ≤ poly(n)).
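As a quick illustration of the model (not a reproduction of the paper's tight bounds), the Python sketch below runs k rotor-router agents on an n-node ring and counts the steps until every node has been visited; placing all agents on one node is an arbitrary initialization.

```python
def ring_rotor_cover_time(n, k):
    """Steps until k rotor-router agents, all started at node 0, have
    jointly visited every node of an n-node ring."""
    pointer = [0] * n              # each node's rotor alternates between its two neighbours
    agents = [0] * k               # all agents at node 0: an arbitrary initialization
    visited = {0}
    steps = 0
    while len(visited) < n:
        steps += 1
        moved = []
        for pos in agents:         # agents sharing a node use consecutive rotor positions
            direction = 1 if pointer[pos] == 0 else -1
            pointer[pos] ^= 1      # advance the rotor at the departed node
            moved.append((pos + direction) % n)
        agents = moved
        visited.update(agents)
    return steps

print(ring_rotor_cover_time(n=1024, k=8))
```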

15 citations

Book ChapterDOI
07 Oct 2015
TL;DR: This work disproves the experimental observation that a system of parallel rotor-router walks stabilizes with a period of at most 2m steps, showing that the period can in fact be superpolynomial in the size of the graph, and provides a characterization of the periodic behavior of parallel rotor-router walks in terms of a structural property of stable states called a subcycle decomposition.
Abstract: The rotor-router model, also called the Propp machine, was introduced as a deterministic alternative to the random walk. In this model, a group of identical tokens are initially placed at nodes of the graph. Each node maintains a cyclic ordering of the outgoing arcs, and during consecutive turns the tokens are propagated along arcs chosen according to this ordering in round-robin fashion. The behavior of the model is fully deterministic. Yanovski et al. (2003) proved that a single rotor-router walk on any graph with m edges and diameter D stabilizes to a traversal of an Eulerian circuit on the set of all 2m directed arcs on the edge set of the graph, and that such periodic behaviour of the system is achieved after an initial transient phase of at most 2mD steps. The case of multiple parallel rotor-routers was studied experimentally, leading Yanovski et al. to the experimental observation that a system of k > 1 parallel walks also stabilizes with a period of length at most 2m steps. In this work we disprove this observation, showing that the period of parallel rotor-router walks can, in fact, be superpolynomial in the size of the graph. On the positive side, we provide a characterization of the periodic behavior of parallel rotor-router walks, in terms of a structural property of stable states called a subcycle decomposition. This property provides us with the tools to efficiently detect whether a given system configuration corresponds to the transient or to the limit behavior of the system. Moreover, we provide polynomial upper bounds of O(m^4 D^2 + mD log k) and O(m^5 k^2) on the number of steps it takes for the system to stabilize. Thus, we are able to predict any future behavior of the system using an algorithm that takes polynomial time and space. In addition, we show that there exists a separation between the stabilization time of the single-walk and multiple-walk rotor-router systems, and that for some graphs the latter can be asymptotically larger even for the case of k = 2 walks.
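To make the transient/period terminology concrete, here is a brute-force Python sketch that simulates a parallel rotor-router system until its global state first repeats; this is only feasible for tiny instances and is not the paper's polynomial-time detection method based on subcycle decompositions.

```python
import networkx as nx

def find_period(graph, tokens, max_steps=100_000):
    """Simulate a parallel rotor-router system until the global state
    (all rotor pointers plus the multiset of token positions) repeats;
    return the transient length and the period."""
    order = {v: list(graph.neighbors(v)) for v in graph}
    pointer = {v: 0 for v in graph}

    def state():
        return tuple(sorted(tokens)), tuple(pointer[v] for v in graph)

    seen = {state(): 0}
    for t in range(1, max_steps + 1):
        moved = []
        for v in tokens:                         # every token moves in every turn
            nbrs = order[v]
            moved.append(nbrs[pointer[v]])
            pointer[v] = (pointer[v] + 1) % len(nbrs)
        tokens = moved
        s = state()
        if s in seen:
            return seen[s], t - seen[s]          # (transient length, period)
        seen[s] = t
    return None

G = nx.cycle_graph(5)
print(find_period(G, tokens=[0, 0, 2]))          # three tokens, two sharing node 0
```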

11 citations

Posted Content
TL;DR: It is proved that it is impossible to achieve perfect resilience on any non-planar graph, and it is shown that graph families which are closed under the subdivision of links can allow for simple and efficient failover algorithms which simply skip failed links.
Abstract: In order to provide a high resilience and to react quickly to link failures, modern computer networks support fully decentralized flow rerouting, also known as local fast failover. In a nutshell, the task of a local fast failover algorithm is to pre-define fast failover rules for each node using locally available information only. These rules determine, for each incoming link on which a packet may arrive and each set of local link failures (i.e., the failed links incident to a node), on which outgoing link a packet should be forwarded. Ideally, such a local fast failover algorithm provides a perfect resilience deterministically: a packet emitted from any source can reach any target, as long as the underlying network remains connected. Feigenbaum et al. showed that it is not always possible to provide perfect resilience and showed how to tolerate a single failure in any network. Interestingly, not much more is known currently about the feasibility of perfect resilience. This paper revisits perfect resilience with local fast failover, both in a model where the source can and cannot be used for forwarding decisions. We first derive several fairly general impossibility results: by establishing a connection between graph minors and resilience, we prove that it is impossible to achieve perfect resilience on any non-planar graph; furthermore, while planarity is necessary, it is also not sufficient for perfect resilience. On the positive side, we show that graph families which are closed under the subdivision of links can allow for simple and efficient failover algorithms which simply skip failed links. We demonstrate this technique by deriving perfect resilience for outerplanar graphs and related scenarios, as well as for scenarios where the source and target are topologically close after failures.
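The "skip failed links" idea lends itself to a compact illustration. The Python sketch below applies it on a ring, which is outerplanar; the port ordering, the failure set, and the routing loop are hypothetical choices for illustration, not the paper's construction.

```python
def forward(node, in_port, failed, order):
    """Next non-failed outgoing link after `in_port` in the node's cyclic order."""
    ports = order[node]
    start = ports.index(in_port) if in_port in ports else -1
    for i in range(1, len(ports) + 1):
        candidate = ports[(start + i) % len(ports)]
        if (node, candidate) not in failed and (candidate, node) not in failed:
            return candidate
    return None                                  # the node is completely cut off

def route(src, dst, n, failed, max_hops=10_000):
    order = {v: [(v - 1) % n, (v + 1) % n] for v in range(n)}   # ring ports
    node, in_port = src, (src - 1) % n           # pretend the packet arrived from the left
    for _ in range(max_hops):
        if node == dst:
            return True
        nxt = forward(node, in_port, failed, order)
        if nxt is None:
            return False
        node, in_port = nxt, node
    return False

# one failed link; the packet detours around the other side of the ring
print(route(src=0, dst=4, n=8, failed={(2, 3)}))
```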

10 citations

Journal ArticleDOI
TL;DR: The dependence of the lock-in time on the initial configuration of the rotor-router mechanism is examined in the form of a game between a player intending to lock in the agent in an Euler tour as quickly as possible and its adversary with the opposite objective.
Abstract: The rotor-router model, also called the Propp machine, was first considered as a deterministic alternative to the random walk. The edges adjacent to each node v (or equivalently, the exit ports at v) are arranged in a fixed cyclic order, which does not change during the exploration. Each node v maintains a port pointer π_v which indicates the exit port to be adopted by an agent on the conclusion of the next visit to this node (the "next exit port"). The rotor-router mechanism guarantees that after each consecutive visit at the same node, the pointer at this node is moved to the next port in the cyclic order. It is known that, in an undirected graph G with m edges, the route adopted by an agent controlled by the rotor-router mechanism eventually forms an Euler tour based on arcs obtained via replacing each edge in G by two arcs with opposite direction. The process of ushering the agent to an Euler tour is referred to as the lock-in problem. In Yanovski et al. (Algorithmica 37(3):165-186, 2003), it was proved that, independently of the initial configuration of the rotor-router mechanism in G, the agent locks in within time bounded by 2mD, where D is the diameter of G. In this paper we examine the dependence of the lock-in time on the initial configuration of the rotor-router mechanism. Our analysis is performed in the form of a game between a player P intending to lock in the agent in an Euler tour as quickly as possible and its adversary A with the opposite objective. We consider all cases of who decides the initial cyclic orders and the initial values π_v. We show, for example, that if A provides its own port numbering after the initial setup of pointers by P, the worst-case complexity of the lock-in problem is Θ(m·min{log m, D}). We also investigate the robustness of the rotor-router graph exploration in the presence of faults in the pointers π_v or dynamic changes in the graph. We show, for example, that after the exploration establishes an Eulerian cycle, if k edges are added to the graph, then a new Eulerian cycle is established within O(km) steps.
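The lock-in behaviour is easy to observe empirically. The following Python sketch (an illustration, not part of the paper's game-based analysis) runs a single rotor-router walk and reports the length of the transient phase, detected as the first moment when the last 2m traversed arcs form an Eulerian circuit.

```python
from collections import deque
import networkx as nx

def lock_in_time(graph, start=None, max_steps=1_000_000):
    """Run a single rotor-router walk and return the number of steps of the
    transient phase, i.e. until the last 2m traversed arcs form an Eulerian
    circuit (each directed arc appears exactly once)."""
    order = {v: list(graph.neighbors(v)) for v in graph}
    pointer = {v: 0 for v in graph}
    v = start if start is not None else next(iter(graph))
    m2 = 2 * graph.number_of_edges()
    window = deque(maxlen=m2)                    # the last 2m traversed arcs
    for step in range(1, max_steps + 1):
        u = order[v][pointer[v]]
        pointer[v] = (pointer[v] + 1) % len(order[v])
        window.append((v, u))
        v = u
        if len(window) == m2 and len(set(window)) == m2:
            return step - m2                     # steps taken before the Euler tour began
    return None

print(lock_in_time(nx.petersen_graph()))
```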

7 citations

Posted Content
TL;DR: This work disproves the conjecture that a system of parallel rotor-router walks stabilizes with a period of at most 2m steps, showing that the period can in fact be superpolynomial in the size of the graph, and provides a characterization of the periodic behavior of parallel rotor-router walks in terms of a structural property of stable states called a subcycle decomposition.
Abstract: The rotor-router model, also called the Propp machine, was introduced as a deterministic alternative to the random walk. In this model, a group of identical tokens are initially placed at nodes of the graph. Each node maintains a cyclic ordering of the outgoing arcs, and during consecutive turns the tokens are propagated along arcs chosen according to this ordering in round-robin fashion. The behavior of the model is fully deterministic. Yanovski et al. (2003) proved that a single rotor-router walk on any graph with m edges and diameter $D$ stabilizes to a traversal of an Eulerian circuit on the set of all 2m directed arcs on the edge set of the graph, and that such periodic behaviour of the system is achieved after an initial transient phase of at most 2mD steps. The case of multiple parallel rotor-routers was studied experimentally, leading Yanovski et al. to the conjecture that a system of $k > 1$ parallel walks also stabilizes with a period of length at most $2m$ steps. In this work we disprove this conjecture, showing that the period of parallel rotor-router walks can, in fact, be superpolynomial in the size of the graph. On the positive side, we provide a characterization of the periodic behavior of parallel rotor-router walks, in terms of a structural property of stable states called a subcycle decomposition. This property provides us with the tools to efficiently detect whether a given system configuration corresponds to the transient or to the limit behavior of the system. Moreover, we provide polynomial upper bounds of $O(m^4 D^2 + mD \log k)$ and $O(m^5 k^2)$ on the number of steps it takes for the system to stabilize. Thus, we are able to predict any future behavior of the system using an algorithm that takes polynomial time and space. In addition, we show that there exists a separation between the stabilization time of the single-walk and multiple-walk rotor-router systems, and that for some graphs the latter can be asymptotically larger even for the case of $k = 2$ walks.

6 citations

References
Book
01 Dec 2008
TL;DR: Markov Chains and Mixing Times, as mentioned in this paper, is an introduction to the modern approach to the theory of Markov chains, whose main goal is to determine the rate of convergence of a Markov chain to the stationary distribution as a function of the size and geometry of the state space; it assumes only an undergraduate-level understanding of probability theory and linear algebra.
Abstract: This book is an introduction to the modern approach to the theory of Markov chains. The main goal of this approach is to determine the rate of convergence of a Markov chain to the stationary distribution as a function of the size and geometry of the state space. The authors develop the key tools for estimating convergence times, including coupling, strong stationary times, and spectral methods. Whenever possible, probabilistic methods are emphasized. The book includes many examples and provides brief introductions to some central models of statistical mechanics. Also provided are accounts of random walks on networks, including hitting and cover times, and analyses of several methods of shuffling cards. As a prerequisite, the authors assume a modest understanding of probability theory and linear algebra at an undergraduate level. "Markov Chains and Mixing Times" is meant to bring the excitement of this active area of research to a wide audience.
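Since the bounds in the main abstract are stated in terms of the spectral gap μ of the graph's transition matrix, a short numerical sketch may be useful; it assumes the standard lazy random-walk matrix P = (I + A/d)/2 of a d-regular graph (an illustrative choice, not something prescribed by the book) and computes the gap with numpy.

```python
import numpy as np
import networkx as nx

def spectral_gap(graph):
    """Spectral gap of the lazy random-walk matrix P = (I + A/d) / 2
    of a regular graph (P is symmetric, so eigvalsh applies)."""
    A = nx.to_numpy_array(graph)
    d = A.sum(axis=1)
    P = 0.5 * (np.eye(len(d)) + A / d[:, None])
    eig = np.sort(np.linalg.eigvalsh(P))
    return 1.0 - eig[-2]                  # gap between the two largest eigenvalues

G = nx.hypercube_graph(6)                 # 6-regular graph on 64 nodes
print(spectral_gap(G))                    # roughly 1/6 for the lazy walk on Q_6
```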

2,573 citations

Journal ArticleDOI
TL;DR: The operators corresponding to particle addition generate an Abelian group, the same as the group for the Abelian sandpile model on the graph, and this equivalence determines the critical steady state and some critical exponents exactly.
Abstract: We propose a new model of self-organized criticality. A particle is dropped at random on a lattice and moves along directions specified by arrows at each site. As it moves, it changes the direction of the arrows according to fixed rules. On closed graphs these walks generate Euler circuits. On open graphs, the particle eventually leaves the system, and a new particle is then added. The operators corresponding to particle addition generate an Abelian group, same as the group for the Abelian sandpile model on the graph. We determine the critical steady state and some critical exponents exactly, using this equivalence.
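A toy Python sketch of the particle-dropping dynamics on a small open one-dimensional lattice follows; the lattice size, the left/right arrow convention, and the recorded statistic are illustrative assumptions rather than the model in the paper's full generality.

```python
import random

def drop_particle(arrows):
    """Drop one particle at a random site of an open 1-D lattice; it flips the
    arrow at each site it visits and then follows it, until it leaves the
    system. Returns the length of the walk."""
    n = len(arrows)
    pos = random.randrange(n)
    steps = 0
    while 0 <= pos < n:
        arrows[pos] ^= 1                  # rotate the arrow (left <-> right) ...
        pos += 1 if arrows[pos] else -1   # ... then move along its new direction
        steps += 1
    return steps

random.seed(0)
arrows = [0] * 50
lengths = [drop_particle(arrows) for _ in range(1000)]
print(max(lengths), sum(lengths) / len(lengths))
```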

192 citations

Proceedings ArticleDOI
08 Nov 1998
TL;DR: This work develops a general technique for the quantitative analysis of iterative distributed load balancing schemes and applies it to obtain bounds on the number of rounds required to achieve coarse balancing in general networks, cycles, and meshes in the diffusive and balancing-circuit models.
Abstract: We develop a general technique for the quantitative analysis of iterative distributed load balancing schemes. We illustrate the technique by studying two simple, intuitively appealing models that are prevalent in the literature: the diffusive paradigm, and periodic balancing circuits (or the dimension exchange paradigm). It is well known that such load balancing schemes can be roughly modeled by Markov chains, but also that this approximation can be quite inaccurate. Our main contribution is an effective way of characterizing the deviation between the actual loads and the distribution generated by a related Markov chain, in terms of a natural quantity which we call the local divergence. We apply this technique to obtain bounds on the number of rounds required to achieve coarse balancing in general networks, cycles and meshes in these models. For balancing circuits, we also present bounds for the stronger requirement of perfect balancing, or counting.
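As a concrete instance of the dimension-exchange (balancing-circuit) paradigm mentioned above, the Python sketch below pairs the nodes of a hypercube across one dimension per round and averages each pair's load with rounding; the dimension, the number of rounds, and the initial load are arbitrary illustrative choices, and the code demonstrates the model rather than the paper's analysis.

```python
def dimension_exchange(load, d, rounds):
    """Balancing-circuit / dimension-exchange balancing on the d-dimensional
    hypercube: in round t, every node averages its load with its neighbour
    across dimension t mod d, rounding the split of odd totals."""
    for t in range(rounds):
        bit = 1 << (t % d)
        for v in range(len(load)):
            u = v ^ bit
            if v < u:                             # handle each matched pair once
                total = load[v] + load[u]
                load[v], load[u] = total // 2, total - total // 2
    return load

d = 10
load = [0] * (1 << d)
load[0] = 10_000                                  # all tokens start at one node
load = dimension_exchange(load, d, rounds=3 * d)
print(max(load) - min(load))                      # final discrepancy after balancing
```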

174 citations

Journal ArticleDOI
TL;DR: Jim Propp's P-machine, a simple deterministic process that simulates a random walk on Z^d to within a constant, is analysed; the proof of the error bound relies on several estimates in the theory of simple random walks and some careful summing.
Abstract: We analyse Jim Propp's $P$-machine, a simple deterministic process that simulates a random walk on ${\mathbb Z}^d$ to within a constant. The proof of the error bound relies on several estimates in the theory of simple random walks and some careful summing. We mention three intriguing conjectures concerning sign-changes and unimodality of functions in the linear span of $\{p(\cdot,{\bf x}) : {\bf x} \in {\mathbb Z}^d\}$, where $p(n,{\bf x})$ is the probability that a walk beginning from the origin arrives at ${\bf x}$ at time $n$.
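The one-dimensional analogue of the P-machine is straightforward to simulate and compare against the random-walk expectation; the sketch below is a rough numerical illustration of the "within a constant" statement under arbitrary parameter choices, not a proof.

```python
from math import comb

def propp_machine(N, T):
    """Run the 1-D Propp machine for T rounds with N tokens at the origin:
    every site deals its tokens alternately right and left, remembering the
    rotor position between rounds. Returns token counts on [-T, T]."""
    size = 2 * T + 1
    load = [0] * size
    rotor = [0] * size                     # 0 = next token goes right, 1 = left
    load[T] = N                            # index T corresponds to the origin
    for _ in range(T):
        nxt = [0] * size
        for i, tokens in enumerate(load):
            if tokens == 0:
                continue
            first_right = 1 if rotor[i] == 0 else 0
            right = (tokens + first_right) // 2
            nxt[i + 1] += right
            nxt[i - 1] += tokens - right
            rotor[i] ^= tokens & 1         # an odd number of tokens flips the rotor
        load = nxt
    return load

N, T = 10**5, 16
load = propp_machine(N, T)
expected = [N * comb(T, (x + T) // 2) / 2**T if (x + T) % 2 == 0 else 0
            for x in range(-T, T + 1)]
print(max(abs(a - b) for a, b in zip(load, expected)))   # per-site deviation
```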

134 citations

Journal ArticleDOI
TL;DR: A simple multi-agent exploration algorithm is presented and it is shown that a single agent following this procedure enters, after a transient period, a periodic motion which is an extended Eulerian cycle, during which all edges are traversed an identical number of times.
Abstract: We consider the problem of patrolling, i.e., ongoing exploration of a network by a decentralized group of simple memoryless robotic agents. The model for the network is an undirected graph, and our goal, beyond complete exploration, is to achieve close to uniform frequency of traversal of the graph's edges. A simple multi-agent exploration algorithm is presented and analyzed. It is shown that a single agent following this procedure enters, after a transient period, a periodic motion which is an extended Eulerian cycle, during which all edges are traversed an identical number of times. We further prove that if the network is Eulerian, a single agent goes into an Eulerian cycle within 2|E|D steps, |E| being the number of edges in the graph and D being its diameter. For a team of k agents, we show that after at most 2(1 + 1/k)|E|D steps the numbers of edge visits in the network are balanced up to a factor of two. In addition, various aspects of the algorithm are demonstrated by simulations.
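The near-uniform edge-traversal property can be checked experimentally. The Python sketch below runs k rotor-router agents on a small random regular graph for a fixed horizon and prints the most- and least-visited edge counts; the graph, the number of agents, and the horizon are illustrative choices.

```python
import networkx as nx

def edge_visit_balance(graph, k, steps):
    """Run k rotor-router agents for `steps` rounds and return the highest
    and lowest per-edge visit counts."""
    order = {v: list(graph.neighbors(v)) for v in graph}
    pointer = {v: 0 for v in graph}
    start = next(iter(graph))
    agents = [start] * k                        # all agents at one node: arbitrary choice
    visits = {frozenset(e): 0 for e in graph.edges()}
    for _ in range(steps):
        moved = []
        for v in agents:
            u = order[v][pointer[v]]
            pointer[v] = (pointer[v] + 1) % len(order[v])
            visits[frozenset((v, u))] += 1
            moved.append(u)
        agents = moved
    counts = visits.values()
    return max(counts), min(counts)

G = nx.random_regular_graph(3, 20, seed=1)
print(edge_visit_balance(G, k=4, steps=5000))
```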

108 citations