Competitive paging algorithms

TLDR
The marking algorithm, a randomized on-line algorithm for the paging problem, is developed, and its expected cost on any sequence of requests is proved to be within a factor of 2Hk of optimum.
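
As a concrete illustration of the phase-based, randomized eviction rule summarized above, here is a minimal Python sketch of a marking-style paging algorithm. The class and variable names are my own and the code is not taken from the paper; it simply marks each requested page and, on a fault, evicts a uniformly random unmarked page, clearing all marks when every cached page is marked.

    import random

    class MarkingPager:
        """Minimal sketch of a marking-style randomized paging algorithm."""

        def __init__(self, k):
            self.k = k            # number of page frames in fast memory
            self.cache = set()    # pages currently in fast memory
            self.marked = set()   # pages marked during the current phase

        def request(self, page):
            fault = page not in self.cache
            if fault:
                if len(self.cache) >= self.k:
                    unmarked = self.cache - self.marked
                    if not unmarked:             # every cached page is marked:
                        self.marked.clear()      # start a new phase
                        unmarked = set(self.cache)
                    victim = random.choice(list(unmarked))
                    self.cache.remove(victim)    # evict a random unmarked page
                self.cache.add(page)
            self.marked.add(page)                # a requested page is always marked
            return fault

    pager = MarkingPager(k=3)
    faults = sum(pager.request(p) for p in [1, 2, 3, 4, 1, 2, 5, 3])
    # faults counts the misses incurred on this particular request sequence
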
About
This article was published in the Journal of Algorithms on 1991-12-01 and is currently open access. It has received 489 citations to date. The article focuses on the topics: Page replacement algorithm and K-server problem.

Citations
Posted Content

Randomized Work-Competitive Scheduling for Cooperative Computing on $k$-partite Task Graphs

TL;DR: A fundamental problem in distributed computing is cooperatively executing a given set of tasks in a dynamic setting; the challenge is to minimize the total work done and to maintain efficiency in the face of dynamically changing processor connectivity.
Posted Content

The Randomized Competitive Ratio of Weighted $k$-server is at least Exponential

TL;DR: This paper cuts the triply exponential gap between the upper and lower bounds down to a singly exponential gap by proving that the competitive ratio is at least exponential in k, substantially improving on the previously known lower bound of about ln k.
Journal Article

Online Service with Delay

TL;DR: In this article, the authors introduce the online service with delay problem, where the goal is to minimize the sum of the distance traveled by the server and the total delay in serving the requests.
Journal Article

Work-Competitive Scheduling on Task Dependency Graphs

TL;DR: This paper presents a simple randomized algorithm for p processors cooperating to perform t known tasks whose dependencies are defined by a k-partite task dependency graph, with the processors additionally subject to a dynamic communication medium.
Book Chapter

Randomized Competitive Analysis for Two-Server Problems

TL;DR: It is proved that there exists a randomized online algorithm for the 2-server 3-point problem whose expected competitive ratio is at most 1.5897; this is the first nontrivial upper bound for randomized k-server algorithms in a general metric space that is well below the corresponding deterministic lower bound.
References
Journal Article

Amortized efficiency of list update and paging rules

TL;DR: This article shows that move-to-front is within a constant factor of optimum among a wide class of list maintenance rules, and analyzes the amortized complexity of LRU, showing that its efficiency differs from that of the off-line paging rule by a factor that depends on the size of fast memory.
Proceedings Article

Probabilistic computations: Toward a unified measure of complexity

TL;DR: Two approaches to the study of the expected running time of algorithms lead naturally to two different definitions of the intrinsic complexity of a problem: the distributional complexity and the randomized complexity, respectively.
Journal Article

Competitive snoopy caching

TL;DR: This work presents new on-line algorithms to be used by the caches of snoopy cache multiprocessor systems to decide which blocks to retain and which to drop in order to minimize communication over the bus.
Journal Article

Competitive algorithms for server problems

TL;DR: This paper seeks to develop on-line algorithms whose performance on any sequence of requests is as close as possible to the performance of the optimum off-line algorithm.
Proceedings Article

Competitive algorithms for on-line problems

TL;DR: This paper presents several general results concerning competitive algorithms, as well as results on specific on-line problems.
Frequently Asked Questions
Q1. What are the contributions mentioned in the paper "Competitive paging algorithms"?

In this paper, the authors develop the marking algorithm, a randomized on-line algorithm for the paging problem, and prove that its expected cost on any sequence of requests is within a factor of 2Hk of the optimum off-line cost. They also study competitive randomized algorithms for the k-server problem on graphs.

Karlin et al. [8] have shown that for two servers in a graph that is an isosceles triangle, the best competitive factor that can be achieved is a constant that approaches e/(e - 1) ≈ 1.582 as the length of the similar sides goes to infinity. 

A randomized on-line algorithm may be viewed as basing its actions on the request sequence σ presented to it and on an infinite sequence ρ of independent unbiased random bits. 
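
To make this view concrete, the small helper below (my illustration, not code from the paper) draws a uniform random index from an explicit stream of unbiased bits via rejection sampling; an on-line algorithm that takes all of its random choices through such a helper becomes a deterministic function of the pair (σ, ρ).

    def uniform_index(bits, m):
        """Return an integer in range(m), consuming unbiased bits from the iterator `bits`."""
        nbits = max(1, (m - 1).bit_length())   # bits needed to cover 0 .. m-1
        while True:
            x = 0
            for _ in range(nbits):
                x = (x << 1) | next(bits)
            if x < m:                          # rejection keeps the distribution uniform
                return x

For example, a marking-style pager could choose its eviction victim with uniform_index(rho, len(unmarked)); running it twice on the same request sequence σ and the same bit sequence ρ then yields identical behavior.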

The marking algorithm is strongly competitive (its competitive factor is Hk) if k = n - 1, but it is not strongly competitive if k < n - 1. 

They showed that LRU running with k servers performs within a factor of k/(k - h + 1) of any off-line algorithm with h ≤ k servers and that this is the minimum competitive factor that can be achieved. 
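(For example, when the off-line algorithm has the same number of servers, h = k, this factor is k/(k - k + 1) = k.)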

They showed that no deterministic algorithm for the k-server problem can be better than k-competitive, they gave k-competitive algorithms for the cases k = 2 and k = n - 1, and they conjectured that there exists a k-competitive k-server algorithm for any graph. 

The adversary is, however, able to maintain a vector p = (p1, p2, ..., pn) of probabilities, where pi is the probability that vertex i is not covered by a server. 
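(A gloss on why this vector is useful, not a quotation from the excerpt above: since the k servers cover at most k distinct vertices, the expected number of covered vertices is at most k, so p1 + p2 + ... + pn is always at least n - k.)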

In that proof, deterministic on-line algorithms B(1), B(2), ..., B(m) of type (k, n) were given, and the deterministic on-line algorithm A of type (k, n) was constructed to be competitive against B(i) for each i. 

If the total expected cost ends up exceeding 1/u, then an arbitrary request is made to an unmarked vertex, and the subphase is over. 

During this phase exactly the vertices of S were requested, so since A is lazy, the authors know that at least d’ of A’s servers were outside of S during the entire phase. 

Armed with these tools (the marking and the probability vector), the adversary can generate a sequence such that the expected cost of each phase to A is H_{n-1}, and the cost to the optimum off-line algorithm is 1.
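
Putting the pieces side by side: for paging with k = n - 1, the adversary's expected cost of H_{n-1} per phase equals Hk, which matches the marking algorithm's factor in that case and sits within a factor of 2 of its general 2Hk guarantee. The short sketch below (illustrative only, not from the paper) simply evaluates these harmonic-number bounds.

    def harmonic(m):
        """H_m = 1 + 1/2 + ... + 1/m."""
        return sum(1.0 / i for i in range(1, m + 1))

    for k in (2, 10, 100):
        n = k + 1                        # the paging case k = n - 1
        lower = harmonic(n - 1)          # adversary's expected cost per phase, H_{n-1} = Hk
        general = 2 * harmonic(k)        # the marking algorithm's general 2Hk guarantee
        print(f"k = {k}: lower bound {lower:.3f}, 2Hk bound {general:.3f}")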