Results 1 – 10 of 114
An optimal online algorithm for metrical task systems
 Journal of the ACM
, 1992
"... Abstract. In practice, almost all dynamic systems require decisions to be made online, without full knowledge of their future impact on the system. A general model for the processing of sequences of tasks is introduced, and a general online decnion algorithm is developed. It is shown that, for an ..."
Abstract

Cited by 186 (9 self)
 Add to MetaCart
Abstract. In practice, almost all dynamic systems require decisions to be made online, without full knowledge of their future impact on the system. A general model for the processing of sequences of tasks is introduced, and a general online decision algorithm is developed. It is shown that, for an important class of special cases, this algorithm is optimal among all online algorithms. Specifically, a task system (S, d) for processing sequences of tasks consists of a set S of states and a cost matrix d, where d(i, j) is the cost of changing from state i to state j (we assume that d satisfies the triangle inequality and all diagonal entries are 0). The cost of processing a given task depends on the state of the system. A schedule for a sequence T1, T2, ..., Tk of tasks is a sequence s1, s2, ..., sk of states, where si is the state in which Ti is processed; the cost of a schedule is the sum of all task-processing costs and state-transition costs incurred. An online scheduling algorithm is one that chooses si knowing only T1, T2, ..., Ti. Such an algorithm is w-competitive if, on any input task sequence, its cost is within an additive constant of w times the optimal offline schedule cost. The competitive ratio w(S, d) is the infimum w for which there is a w-competitive online scheduling algorithm for (S, d). It is shown that w(S, d) = 2|S| − 1 for every task system in which d is symmetric, and w(S, d) = O(|S|^2) for every task system. Finally, randomized online scheduling algorithms are introduced. It is shown that for the uniform task system (in which d(i, j) = 1 for all i ≠ j), the expected competitive ratio w(S, d) = ...
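The cost model in this abstract is easy to make concrete. Below is a minimal sketch of the schedule-cost computation for a task system (S, d): the states, cost matrix, and task vectors are hypothetical illustrative data, not from the paper.

```python
# Sketch of the task-system cost model: a task system (S, d) with
# states S = {0, 1}, a symmetric cost matrix d with zero diagonal,
# and tasks given as per-state processing-cost vectors.

def schedule_cost(d, tasks, schedule, start_state=0):
    """Total cost = state-transition costs + task-processing costs."""
    cost, state = 0, start_state
    for task, s in zip(tasks, schedule):
        cost += d[state][s]   # cost of changing from `state` to `s`
        cost += task[s]       # cost of processing the task in state `s`
        state = s
    return cost

d = [[0, 1],
     [1, 0]]                      # symmetric, zero diagonal
tasks = [(0, 5), (5, 0), (0, 5)]  # per-state processing costs
print(schedule_cost(d, tasks, [0, 1, 0]))  # always process cheaply: cost 2
print(schedule_cost(d, tasks, [0, 0, 0]))  # never move: cost 5
```

An offline optimum would minimize this quantity over all schedules; an online algorithm must pick each state without seeing future tasks, which is exactly what the competitive ratio w(S, d) measures.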
Competitive Paging Algorithms
, 1991
"... The paging problem is that of deciding which pages to keep in a memory of k ..."
Abstract

Cited by 164 (22 self)
 Add to MetaCart
The paging problem is that of deciding which pages to keep in a memory of k ...
On the k-Server Conjecture
 Journal of the ACM
, 1995
"... We prove that the work function algorithm for the kserver problem has competitive ratio at most 2k \Gamma 1. Manasse, McGeoch, and Sleator [24] conjectured that the competitive ratio for the kserver problem is exactly k (it is trivially at least k); previously the best known upper bound was ex ..."
Abstract

Cited by 95 (6 self)
 Add to MetaCart
We prove that the work function algorithm for the k-server problem has competitive ratio at most 2k − 1. Manasse, McGeoch, and Sleator [24] conjectured that the competitive ratio for the k-server problem is exactly k (it is trivially at least k); previously the best known upper bound was exponential in k. Our proof involves three crucial ingredients: a quasiconvexity property of work functions, a duality lemma that uses quasiconvexity to characterize the configurations that achieve maximum increase of the work function, and a potential function that exploits the duality lemma.

1 Introduction

The k-server problem [24, 25] is defined on a metric space M, which is a (possibly infinite) set of points with a symmetric, nonnegative distance function d that satisfies the triangle inequality: for all points x, y, and z,

  d(x, x) = 0
  d(x, y) = d(y, x)
  d(x, y) ≤ d(x, z) + d(z, y)

On the metric space M reside k servers that can move from point to point. A possib...
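The three metric axioms listed in the introduction can be checked mechanically on a finite point set. A minimal sketch (the point sets and distance functions are illustrative assumptions):

```python
from itertools import product

def is_metric(points, d):
    """Check zero diagonal, symmetry, and the triangle inequality
    over all triples of points, exactly as the axioms above state."""
    for x, y, z in product(points, repeat=3):
        if d(x, x) != 0 or d(x, y) != d(y, x) or d(x, y) > d(x, z) + d(z, y):
            return False
    return True

line_points = [0.0, 1.0, 4.0]
print(is_metric(line_points, lambda x, y: abs(x - y)))     # the real line: True
print(is_metric([0, 1, 2], lambda x, y: (x - y) ** 2))     # squared distance
                                                           # violates the triangle
                                                           # inequality: False
```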
Random Walks on Weighted Graphs, and Applications to Online Algorithms (Extended Abstract)
 Journal of the ACM
, 1990
"... We study the design and analysis of randomized online algorithms. ..."
Abstract

Cited by 76 (2 self)
 Add to MetaCart
We study the design and analysis of randomized online algorithms.
New Results on Server Problems
 SIAM Journal on Discrete Mathematics
, 1990
"... In the kserver problem, we must choose how k mobile servers will serve each of a sequence of requests, making our decisions in an online manner. We exhibit an optimal deterministic online strategy when the requests fall on the real line. For the weightedcache problem, in which the cost of moving t ..."
Abstract

Cited by 73 (7 self)
 Add to MetaCart
In the k-server problem, we must choose how k mobile servers will serve each of a sequence of requests, making our decisions in an online manner. We exhibit an optimal deterministic online strategy when the requests fall on the real line. For the weighted-cache problem, in which the cost of moving to x from any other point is w(x), the weight of x, we also provide an optimal deterministic algorithm. We prove the nonexistence of competitive algorithms for the asymmetric two-server problem, and of memoryless algorithms for the weighted-cache problem. We give a fast algorithm for computing an optimal offline schedule, and show that finding an optimal offline schedule is at least as hard as the assignment problem.

1 Introduction

The k-server problem can be stated as follows. We are given a metric space M, and k servers which move among the points of M, each occupying one point of M. Repeatedly, a request (a point x ∈ M) appears. To serve x, each server moves some distance, possibly...
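The optimal strategy on the real line mentioned here is commonly known as Double Coverage: a request outside the servers' span is served by the nearest server, while a request between two servers draws its two neighbors toward it at equal speed. A minimal one-step sketch under that description (function name and representation are illustrative):

```python
def double_coverage(servers, request):
    """One step of the Double Coverage strategy on the real line (sketch).

    servers: list of server positions; request: the requested point.
    Returns the new sorted server positions after serving the request.
    """
    servers = sorted(servers)
    if request <= servers[0]:          # outside the span: nearest server moves
        servers[0] = request
    elif request >= servers[-1]:
        servers[-1] = request
    else:                              # between two servers: both move equally
        i = max(j for j, s in enumerate(servers) if s <= request)
        step = min(request - servers[i], servers[i + 1] - request)
        servers[i] += step             # whichever reaches the request serves it
        servers[i + 1] -= step
    return servers

print(double_coverage([0, 10], 3))    # → [3, 7]
print(double_coverage([0, 10], 12))   # → [0, 12]
```

Note that the "extra" movement of the non-serving neighbor is what makes the strategy competitive; a purely greedy nearest-server rule is not.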
Online file caching
 In Proc. of the 9th Annual ACM-SIAM Symp. on Discrete Algorithms
, 1998
"... Consider the following file caching problem: in response to a sequence of requests for files, where each file has a specified size and retrieval cost, maintain a cache of files of total size at most some specified k so as to minimize the total retrieval cost. Specifically, when a requested file is n ..."
Abstract

Cited by 68 (2 self)
 Add to MetaCart
Consider the following file caching problem: in response to a sequence of requests for files, where each file has a specified size and retrieval cost, maintain a cache of files of total size at most some specified k so as to minimize the total retrieval cost. Specifically, when a requested file is not in the cache, bring it into the cache, pay the retrieval cost, and choose files to remove from the cache so that the total size of files in the cache is at most k. This problem generalizes previous paging and caching problems by allowing objects of arbitrary size and cost, both important attributes when caching files for World Wide Web browsers, servers, and proxies. We give a simple deterministic online algorithm that generalizes many well-known paging and weighted-caching strategies, including least-recently-used, first-in-first-out, ...
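A credit-based eviction rule of the kind this abstract describes (often presented as the Landlord algorithm) can be sketched as follows; the function name and the choice to refresh credit fully on a hit are illustrative assumptions, and file sizes are assumed to be at most k.

```python
def serve(cache, credit, k, name, size, cost):
    """Serve one request for file `name`; cache: name -> size,
    credit: name -> remaining credit. Returns the retrieval cost paid."""
    if name in cache:
        credit[name] = cost                  # refresh credit on a hit (LRU-like)
        return 0
    while sum(cache.values()) + size > k:    # make room for the new file
        # charge every cached file "rent" proportional to its size, then
        # evict the file(s) whose credit reaches zero
        delta = min(credit[f] / cache[f] for f in cache)
        for f in list(cache):
            credit[f] -= delta * cache[f]
            if credit[f] <= 1e-9:
                del cache[f]
                del credit[f]
    cache[name] = size                       # bring the file in,
    credit[name] = cost                      # endowed with its retrieval cost
    return cost
```

With unit sizes and unit costs this degenerates to a classical paging strategy, which is the sense in which it "generalizes many well-known paging and weighted-caching strategies".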
The k-server problem
 Computer Science Review
"... The kserver problem is perhaps the most influential online problem: natural, crisp, with a surprising technical depth that manifests the richness of competitive analysis. The kserver conjecture, which was posed more that two decades ago when the problem was first studied within the competitive ana ..."
Abstract

Cited by 66 (5 self)
 Add to MetaCart
The k-server problem is perhaps the most influential online problem: natural, crisp, with a surprising technical depth that manifests the richness of competitive analysis. The k-server conjecture, which was posed more than two decades ago when the problem was first studied within the competitive analysis framework, is still open and has been a major driving force for the development of the area of online algorithms. This article surveys some major results for the k-server problem.
Competitive Analysis of Randomized Paging Algorithms
, 2000
"... The paging problem is defined as follows: we are given a twolevel memory system, in which one level is a fast memory, called cache, capable of holding k items, and the second level is an unbounded but slow memory. At each given time step, a request to an item is issued. Given a request to an item p ..."
Abstract

Cited by 62 (9 self)
 Add to MetaCart
The paging problem is defined as follows: we are given a two-level memory system, in which one level is a fast memory, called the cache, capable of holding k items, and the second level is an unbounded but slow memory. At each time step, a request to an item is issued. Given a request to an item p, a miss occurs if p is not present in the fast memory. In response to a miss, we need to choose an item q in the cache and replace it by p. The choice of q needs to be made online, without knowledge of future requests. The objective is to design a replacement strategy with a small number of misses. In this paper we use competitive analysis to study the performance of randomized online paging algorithms. Our goal is to show how the concept of work functions, used previously mostly for the analysis of deterministic algorithms, can also be applied, in a systematic fashion, to the randomized case. We present two results: we first show that the competitive ratio of the marking algorithm is ex...
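The marking algorithm analyzed in this abstract is standard: pages are marked as they are requested, a miss evicts a uniformly random unmarked page, and when all cached pages are marked a new phase begins with all marks cleared. A minimal sketch (function name is illustrative):

```python
import random

def marking_paging(k, requests, rng=random):
    """Faults incurred by the randomized marking algorithm (sketch)."""
    cache, marked, faults = set(), set(), 0
    for p in requests:
        if p not in cache:
            faults += 1
            if len(cache) == k:              # cache full: must evict
                if marked == cache:          # all marked: a new phase begins
                    marked = set()
                victim = rng.choice(sorted(cache - marked))  # uniform unmarked
                cache.remove(victim)
        cache.add(p)
        marked.add(p)                        # every requested page gets marked
    return faults

print(marking_paging(2, [1, 2, 3, 1]))  # 3 misses, then a coin-flip hit or miss
```

The randomness only matters within a phase; the phase boundaries themselves are determined by the request sequence alone.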
The k-Server Dual and Loose Competitiveness for Paging
 Algorithmica
, 1994
"... Weighted caching is a generalization of paging in which the cost to ..."
Abstract

Cited by 61 (6 self)
 Add to MetaCart
Weighted caching is a generalization of paging in which the cost to ...
The server problem and online games
 Online Algorithms, volume 7 of DIMACS Series in Discrete Mathematics and Theoretical Computer Science
, 1992
"... ..."