
Showing papers by "Guido Schäfer published in 2003"


Proceedings ArticleDOI
11 Oct 2003
TL;DR: In this paper, the authors introduced the notion of smoothed competitive analysis of online algorithms and applied it to analyze the Multi-Level Feedback algorithm to minimize the total flow time on a sequence of jobs released over time when the processing time of a job is only known at time of completion.
Abstract: In this paper, we introduce the notion of smoothed competitive analysis of online algorithms. Smoothed analysis has been proposed by Spielman and Teng (2001) to explain the behavior of algorithms that work well in practice while performing very poorly from a worst-case analysis point of view. We apply this notion to analyze the Multi-Level Feedback (MLF) algorithm to minimize the total flow time on a sequence of jobs released over time, when the processing time of a job is only known at time of completion. The initial processing times are integers in the range [1, 2^K]. We use a partial bit randomization model, where the initial processing times are smoothed by changing the k least significant bits under a quite general class of probability distributions. We show that MLF admits a smoothed competitive ratio of O((2^k/σ)^3 + (2^k/σ)^2 · 2^(K−k)), where σ denotes the standard deviation of the distribution. In particular, we obtain a competitive ratio of O(2^(K−k)) if σ = Θ(2^k). We also prove an Ω(2^(K−k)) lower bound for any deterministic algorithm that is run on processing times smoothed according to the partial bit randomization model. For various other smoothing models, we give a higher lower bound of Ω(2^K). A direct consequence of our result is also the first average-case analysis of MLF. We show a constant expected ratio of the total flow time of MLF to the optimum under several distributions, including the uniform distribution.
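The partial bit randomization model described in the abstract can be sketched in a few lines of Python. This is an illustrative reconstruction, not code from the paper: the function name is mine, and the uniform default stands in for the paper's general class of distributions over the k low-order bits.

```python
import random

def smooth_processing_time(p, k, rng=random):
    """Partial bit randomization: replace the k least significant bits
    of the integer processing time p with random bits, leaving the
    high-order bits untouched. With k = 0 the input is unchanged."""
    return (p >> k << k) | rng.randrange(2 ** k)
```

For example, smoothing p = 13 with k = 2 yields a value in {12, 13, 14, 15}: the high bits are preserved and only the two low-order bits are randomized.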

35 citations


Journal ArticleDOI
TL;DR: A heuristic is described that leads to a significant improvement in running time for the single-source many-targets shortest-path computations arising in weighted bipartite matching algorithms, on directed graphs with non-negative edge weights, together with a partial analysis that gives some theoretical support for the experimental findings.
Abstract: We consider the single-source many-targets shortest-path (SSMTSP) problem in directed graphs with non-negative edge weights. A source node s and a target set T are specified, and the goal is to compute a shortest path from s to a node in T. Our interest in the shortest-path problem with many targets stems from its use in weighted bipartite matching algorithms. A weighted bipartite matching in a graph with n nodes on each side reduces to n SSMTSP problems, where the number of targets varies between n and 1. The SSMTSP problem can be solved by Dijkstra's algorithm. We describe a heuristic that leads to a significant improvement in running time for the weighted matching problem; in our experiments a speed-up of up to a factor of 12 was achieved. We also present a partial analysis that gives some theoretical support for our experimental findings.
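The SSMTSP setting can be illustrated with a short Dijkstra variant. This is a sketch of the general idea only, not the paper's actual heuristic or implementation: it models a pruning rule as maintaining an upper bound B (the best tentative distance to any target seen so far) and discarding queue insertions that cannot beat it, and it stops as soon as a target node is settled. All names are mine.

```python
import heapq

def ssmtsp(adj, s, targets):
    """Shortest distance from s to the nearest node in `targets`.
    adj: {u: [(v, w), ...]} with non-negative weights w."""
    targets = set(targets)
    INF = float('inf')
    B = INF                  # best tentative distance to any target so far
    dist = {s: 0}
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, INF):
            continue         # stale queue entry
        if u in targets:
            return d         # first settled target is the nearest one
        for v, w in adj.get(u, []):
            nd = d + w
            if nd >= B:
                continue     # pruned: cannot improve on the target bound
            if nd < dist.get(v, INF):
                dist[v] = nd
                if v in targets:
                    B = nd   # tighten the upper bound
                heapq.heappush(pq, (nd, v))
    return INF
```

With non-negative weights, any node whose tentative distance already reaches B cannot lie on a path to a target shorter than B, so pruning it is safe.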

14 citations


01 Jan 2003
TL;DR: This work proves that the smoothed complexity bound for the single-source shortest path problem still holds if the k least significant bits of each edge cost are replaced by a random number chosen from {0, …, 2^k − 1} according to an arbitrary probability distribution whose expectation is not too close to zero.
Abstract: Banderier, Beier and Mehlhorn [BBM03] showed that the single-source shortest path problem has smoothed complexity O(m + n(K − k)) if the edge costs are K-bit integers and the k least significant bits are perturbed randomly. Their analysis holds if each bit is set to 0 or 1 with probability 1/2. We extend their result and show that the same analysis goes through for a large class of probability distributions: we prove a smoothed complexity of O(m + n(K − k)) if the last k bits of each edge cost are replaced by a random number chosen from {0, …, 2^k − 1} according to an arbitrary probability distribution whose expectation is not too close to zero. We do not require that the edge costs are perturbed independently; the same time bound holds even if the random perturbations are heterogeneous. Our analysis implies a linear average-case running time for various probability distributions. We also show that the running time is O(m + n(K − k)) with high probability if the random replacements are chosen independently.
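The perturbation model on edge costs can be sketched as follows. This is an illustrative reconstruction (all names are mine, not from the paper): each edge cost keeps its high-order bits, while the k low-order bits are replaced by a draw from a per-edge distribution, which models the heterogeneous, not-necessarily-independent perturbations the abstract mentions.

```python
import random

def perturb_edge_costs(costs, k, samplers=None, seed=None):
    """Replace the k least significant bits of each integer edge cost
    with a random value in [0, 2**k). `samplers` may supply a different
    distribution per edge (heterogeneous perturbations); uniformly
    random bits are used by default."""
    rng = random.Random(seed)
    uniform = lambda r: r.randrange(2 ** k)
    out = []
    for i, c in enumerate(costs):
        draw = samplers[i] if samplers else uniform
        out.append((c >> k << k) | draw(rng))
    return out
```

Note that a sampler concentrated near zero is exactly the degenerate case the result excludes: the abstract requires the distribution's expectation to be not too close to zero.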

3 citations


Proceedings ArticleDOI
22 Apr 2003
TL;DR: This survey reviews work on scheduling to minimize average flow time or related metrics, on single and parallel machines, in an abstract model, and provides an overview of results and main open issues for average stretch and weighted flow time.
Abstract: Scheduling on multiple machines is a classical problem and dates back to the 1960s. In this survey we review work on scheduling to minimize average flow time or related metrics, on single and parallel machines. We consider an abstract model in which a set of jobs is presented online to a set of identical machines. Each job has a processing time and has to be processed, possibly over a noncontiguous interval, for an overall amount of time equal to its processing time. All techniques that we present were initially applied to average flow time, while some of them have also been used to prove competitiveness results for average stretch and weighted flow time. For this reason, our focus is mainly on average flow time, while we only provide an overview of results and main open issues for average stretch and weighted flow time.
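As a concrete instance of the flow-time objective the survey studies, here is a minimal single-machine simulator for SRPT (shortest remaining processing time), which is known to minimize total flow time on one machine with preemption. The code is an illustrative sketch of mine, not taken from the survey.

```python
import heapq

def srpt_total_flow_time(jobs):
    """Simulate preemptive SRPT on a single machine.
    jobs: list of (release_time, processing_time) pairs.
    Returns the total flow time, i.e. the sum over all jobs of
    (completion time - release time)."""
    jobs = sorted(jobs)                 # process releases in time order
    t, i, total = 0, 0, 0
    heap = []                           # entries: (remaining_time, release_time)
    while i < len(jobs) or heap:
        if not heap:
            t = max(t, jobs[i][0])      # machine idles until next release
        while i < len(jobs) and jobs[i][0] <= t:
            heapq.heappush(heap, (jobs[i][1], jobs[i][0]))
            i += 1
        rem, rel = heapq.heappop(heap)  # job with shortest remaining time
        # run it until it finishes or the next job is released
        horizon = jobs[i][0] if i < len(jobs) else float('inf')
        run = min(rem, horizon - t)
        t += run
        if run == rem:
            total += t - rel            # job completes: add its flow time
        else:
            heapq.heappush(heap, (rem - run, rel))  # preempted
    return total
```

For example, with jobs [(0, 3), (1, 1)], SRPT preempts the long job at time 1, finishes the short job at time 2 and the long job at time 4, for a total flow time of 4 + 1 = 5; any non-preemptive order does worse.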

2 citations