Results 1–10 of 30
Speed is as Powerful as Clairvoyance
Journal of the ACM, 1995
Abstract
Cited by 179 (23 self)
We consider several well-known nonclairvoyant scheduling problems, including the problem of minimizing the average response time, and best-effort firm real-time scheduling. It is known that there are no deterministic online algorithms for these problems with bounded (or even polylogarithmic in the number of jobs) competitive ratios. We show that moderately increasing the speed of the processor used by the nonclairvoyant scheduler effectively gives this scheduler the power of clairvoyance. Furthermore, we show that there exist online algorithms with bounded competitive ratios on all inputs that are not closely correlated with processor speed.

1 Introduction

We consider several well-known nonclairvoyant scheduling problems, including the problem of minimizing the average response time [13, 15], and best-effort firm real-time scheduling [1, 2, 3, 4, 8, 11, 12, 18]. (We postpone formally defining these problems until the next section.) In nonclairvoyant scheduling some relevant information...
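The speed-augmentation idea can be made concrete with a small simulation. The sketch below (our own illustration, not the paper's construction) compares total flow time under the clairvoyant SRPT rule at speed 1 against the nonclairvoyant Round-Robin rule given a twice-as-fast processor; the job set, time step, and speed factor are arbitrary choices.

```python
# Illustrative sketch only: a tiny time-stepped simulation comparing a
# clairvoyant scheduler (SRPT, shortest remaining processing time) at speed 1
# against the nonclairvoyant Round-Robin scheduler on a faster processor.

def srpt_flow(jobs, speed=1.0, dt=0.001):
    """Total flow (response) time under SRPT; jobs = [(release, size), ...]."""
    remaining = {i: size for i, (_, size) in enumerate(jobs)}
    finish = {}
    t = 0.0
    while remaining:
        active = [i for i in remaining if jobs[i][0] <= t]
        if active:
            j = min(active, key=lambda i: remaining[i])  # clairvoyant choice
            remaining[j] -= speed * dt
            if remaining[j] <= 0:
                finish[j] = t + dt
                del remaining[j]
        t += dt
    return sum(finish[i] - jobs[i][0] for i in finish)

def rr_flow(jobs, speed=1.0, dt=0.001):
    """Total flow time under Round-Robin: the processor is shared equally
    among active jobs, using no knowledge of the remaining job sizes."""
    remaining = {i: size for i, (_, size) in enumerate(jobs)}
    finish = {}
    t = 0.0
    while remaining:
        active = [i for i in remaining if jobs[i][0] <= t]
        for i in active:
            remaining[i] -= speed * dt / len(active)
        for i in [j for j in active if remaining[j] <= 0]:
            finish[i] = t + dt
            del remaining[i]
        t += dt
    return sum(finish[i] - jobs[i][0] for i in finish)

jobs = [(0.0, 1.0), (0.0, 2.0), (1.0, 0.5)]
opt = srpt_flow(jobs, speed=1.0)   # clairvoyant, speed 1
aug = rr_flow(jobs, speed=2.0)     # nonclairvoyant, but twice as fast
print(opt, aug)
```

On this instance the faster nonclairvoyant scheduler already beats the clairvoyant one; the paper's point is that a moderate speedup suffices to bound the ratio on all inputs.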
Beyond Competitive Analysis
2000
Abstract
Cited by 118 (3 self)
The competitive analysis of online algorithms has been criticized as being too crude and unrealistic. We propose refinements of competitive analysis in two directions: The first restricts the power of the adversary by allowing only certain input distributions, while the other allows for comparisons between information regimes for online decision-making. We illustrate the first with an application to the paging problem; as a byproduct we characterize completely the work functions of this important special case of the k-server problem. We use the second refinement to explore the power of lookahead in server and task systems.
On the Influence of Lookahead in Competitive Paging Algorithms
Algorithmica, 1997
Abstract
Cited by 34 (1 self)
We introduce a new model of lookahead for online paging algorithms and study several algorithms using this model. A paging algorithm is online with strong lookahead l if it sees the present request and a sequence of future requests that contains l pairwise distinct pages. We show that strong lookahead has practical as well as theoretical importance and improves the competitive factors of online paging algorithms. This is the first model of lookahead having such properties. In addition to lower bounds we present a number of deterministic and randomized online paging algorithms with strong lookahead which are optimal or nearly optimal.
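As a toy illustration of how strong lookahead can help, the sketch below compares plain LRU with an LRU variant that, on a fault, prefers to evict a page not appearing among the next l pairwise distinct requested pages. The eviction rule is our own simplification for illustration, not one of the paper's algorithms.

```python
from collections import OrderedDict

def faults_lru(seq, k):
    cache = OrderedDict()
    faults = 0
    for p in seq:
        if p in cache:
            cache.move_to_end(p)
        else:
            faults += 1
            if len(cache) >= k:
                cache.popitem(last=False)  # evict least recently used
            cache[p] = True
    return faults

def strong_lookahead(seq, i, l):
    """Pages in the shortest future window containing l distinct pages."""
    seen = set()
    j = i + 1
    while j < len(seq) and len(seen) < l:
        seen.add(seq[j])
        j += 1
    return seen

def faults_lru_lookahead(seq, k, l):
    cache = OrderedDict()
    faults = 0
    for i, p in enumerate(seq):
        if p in cache:
            cache.move_to_end(p)
            continue
        faults += 1
        if len(cache) >= k:
            look = strong_lookahead(seq, i, l)
            # prefer evicting a page not requested in the lookahead;
            # fall back to the LRU page if every cached page appears in it
            victim = next((q for q in cache if q not in look), next(iter(cache)))
            del cache[victim]
        cache[p] = True
    return faults

seq = [1, 2, 3, 4] * 3
print(faults_lru(seq, 3), faults_lru_lookahead(seq, 3, 2))
```

On this cyclic sequence plain LRU faults on every request, while even a strong lookahead of 2 distinct pages halves the fault count.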
Tight Bounds for Prefetching and Buffer Management Algorithms for Parallel I/O Systems
In Foundations of Software Technology and Theoretical Computer Science, 1996
Abstract
Cited by 25 (11 self)
The growing importance of multiple-disk parallel I/O systems requires the development of appropriate prefetching and buffer management algorithms. We answer several fundamental questions on prefetching and buffer management for such parallel I/O systems. Specifically, we find and prove the optimality of an algorithm, PMIN, that minimizes the number of parallel I/Os. Secondly, we analyze PCON, an algorithm which always matches its replacement decisions with those of the well-known demand-paged MIN algorithm. We show that PCON can become fully sequential in the worst case. Finally, we define and analyze PLRU, a semi-online version of the traditional LRU buffer-management algorithm. Unexpectedly, we find that the performance of PLRU is independent of the number of disks.

1 Introduction

The increasing imbalance between the speeds of processors and I/O devices has resulted in the I/O subsystem becoming a bottleneck in many applications. The use of multiple disks to build...
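For reference, the single-disk MIN rule whose replacement decisions PCON matches is Belady's offline algorithm: on a fault with a full buffer, evict the page whose next request lies furthest in the future. The sketch below implements only this classical rule (not PMIN's parallel I/O scheduling); the sequence and buffer size are our own toy choices.

```python
# Belady's offline MIN: evict the page whose next request is furthest away
# (or that is never requested again). This is the demand-paging baseline the
# abstract refers to; parallel prefetching is not modeled here.

def faults_min(seq, k):
    cache = set()
    faults = 0
    for i, p in enumerate(seq):
        if p in cache:
            continue
        faults += 1
        if len(cache) >= k:
            def next_use(q):
                for j in range(i + 1, len(seq)):
                    if seq[j] == q:
                        return j
                return float('inf')   # never requested again
            cache.remove(max(cache, key=next_use))
        cache.add(p)
    return faults

print(faults_min([1, 2, 3, 4, 1, 2, 3, 4], 3))
```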
The Statistical Adversary Allows Optimal Money-Making Trading Strategies (Extended Abstract)
1993
Abstract
Cited by 22 (4 self)
Andrew Chou, Jeremy Cooperstock, Ran El-Yaniv, Michael Klugerman, Tom Leighton. November 1993. The distributional approach and competitive analysis have traditionally been used for the design and analysis of online algorithms. The former assumes a specific distribution on inputs, while the latter assumes inputs are chosen by an unrestricted adversary. This paper employs the statistical adversary (recently proposed by Raghavan) to analyze and design online algorithms for two-way currency trading. The statistical adversary approach may be viewed as a hybrid of the distributional approach and competitive analysis. By statistical adversary, we mean an adversary that generates input sequences, where each sequence must satisfy certain general statistical properties. The online algorithms presented in this paper have some very attractive properties. For instance, the algorithms are money-making; they are guaranteed to be profitable when the optimal offline...
The relative worst order ratio for online algorithms
In 5th Italian Conference on Algorithms and Complexity, volume 2653 of LNCS, 2003
Abstract
Cited by 20 (10 self)
We define a new measure for the quality of online algorithms, the relative worst order ratio, using ideas from the Max/Max ratio (Ben-David & Borodin 1994) and from the random order ratio (Kenyon 1996). The new ratio is used to compare online algorithms directly by taking the ratio of their performances on their respective worst permutations of a worst-case sequence. Two variants of the bin packing problem are considered: the Classical Bin Packing problem, where the goal is to fit all items in as few bins as possible, and the Dual Bin Packing problem, which is the problem of maximizing the number of items packed in a fixed number of bins. Several known algorithms are compared using this new measure, and a new, simple variant of FirstFit is proposed for Dual Bin Packing. Many of our results are consistent with those previously obtained with the competitive ratio or the competitive ratio on accommodating sequences, but new separations and easier proofs are found.
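The "worst permutation" ingredient of the measure is easy to compute by brute force on a small instance. The sketch below evaluates two classical bin packing heuristics on their worst orderings of one fixed sequence; Next-Fit appears only as a convenient second algorithm for comparison, and the item sizes are our own choices (the full ratio would take a supremum over all sequences).

```python
from itertools import permutations

def first_fit(items, cap=1.0):
    bins = []
    for x in items:
        for b in bins:
            if sum(b) + x <= cap + 1e-9:
                b.append(x)
                break
        else:
            bins.append([x])   # no open bin fits: open a new one
    return len(bins)

def next_fit(items, cap=1.0):
    bins, load = 1, 0.0
    for x in items:
        if load + x <= cap + 1e-9:
            load += x
        else:
            bins += 1          # close the current bin for good
            load = x
    return bins

def worst_order(alg, items):
    """Cost of the algorithm on its worst permutation of the sequence."""
    return max(alg(list(p)) for p in permutations(items))

items = [0.5, 0.5, 0.3, 0.3, 0.3, 0.6]
ff_w, nf_w = worst_order(first_fit, items), worst_order(next_fit, items)
print(ff_w, nf_w)
```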
The Accommodating Function: a generalization of the competitive ratio
In Sixth International Workshop on Algorithms and Data Structures, volume 1663 of Lecture Notes in Computer Science, 1998
Abstract
Cited by 14 (9 self)
A new measure, the accommodating function, for the quality of online algorithms is presented. The accommodating function, which is a generalization of both the competitive ratio and the accommodating ratio, measures the quality of an online algorithm as a function of the resources that would be sufficient for an optimal algorithm to fully grant all requests. More precisely, if we have some amount of resources n, the function value at α is the usual ratio (still on some fixed amount of resources n), except that input sequences are restricted to those where all requests could have been fully granted by an optimal algorithm if it had had the amount of resources αn. The accommodating functions for three specific online problems are investigated: a variant of bin packing in which the goal is to maximize the number of objects put in n bins, the seat reservation problem, and the problem of optimizing total flow time when preemption is allowed.
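The restriction to "accommodatable" sequences can be sketched concretely for the Dual Bin Packing variant: run a greedy First-Fit on n bins, but only admit sequences that an optimal offline packing could fit entirely into αn bins. The instance, n, and α below are our own small choices for illustration.

```python
from itertools import permutations

def first_fit_dual(items, n, cap=1.0):
    """Greedy First-Fit into n fixed bins; items that fit nowhere are lost."""
    loads = [0.0] * n
    packed = 0
    for x in items:
        for i in range(n):
            if loads[i] + x <= cap + 1e-9:
                loads[i] += x
                packed += 1
                break
    return packed

def opt_fits(items, m, cap=1.0):
    """Brute force: can an offline packing place *all* items into m bins?"""
    loads = [0.0] * m
    def rec(i):
        if i == len(items):
            return True
        for b in range(m):
            if loads[b] + items[i] <= cap + 1e-9:
                loads[b] += items[i]
                if rec(i + 1):
                    return True
                loads[b] -= items[i]
        return False
    return rec(0)

n, alpha = 2, 1.0
items = [0.6, 0.6, 0.4, 0.4]
assert opt_fits(items, int(alpha * n))   # sequence is admissible at this alpha
worst = min(first_fit_dual(list(p), n) for p in permutations(items))
print(worst, "of", len(items), "items packed in the worst order")
```

Even on sequences that an optimal packing grants fully, the online heuristic can lose an item in an unlucky order; the accommodating function tracks how this gap varies with α.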
Seller-Focused Algorithms for Online Auctioning
2001
Abstract
Cited by 9 (1 self)
In this paper we provide an algorithmic approach to the study of online auctioning. From the perspective of the seller we formalize the auctioning problem as that of designing an algorithmic strategy that fairly maximizes the revenue earned by selling n identical items to bidders who submit bids online.
The relative worst order ratio applied to seat reservation
In SWAT: Scandinavian Workshop on Algorithm Theory, 2004
Abstract
Cited by 7 (2 self)
The relative worst order ratio is a new measure for the quality of online algorithms, which has given new separations and even new algorithms for a variety of problems. Here, we apply the relative worst order ratio to the seat reservation problem, the problem of assigning seats to passengers in a train. For the unit price problem, where all tickets have the same cost, we show that FirstFit and BestFit are better than WorstFit, even though they have not been separated using the competitive ratio. The same relative worst order ratio result holds for the proportional price problem, where the ticket price is proportional to the distance travelled. In contrast, no deterministic algorithm has a competitive ratio, or even a competitive ratio on accommodating sequences, which is bounded below by a constant. It is also shown that the worst order ratio for seat reservation algorithms is very closely related to the competitive ratio on accommodating sequences.
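The three seat-assignment policies can be sketched on a toy train. Requests are station intervals [a, b); "best" and "worst" are interpreted below as the most- and least-booked feasible seat, which is our own reading for illustration, and the request sequence is our own instance.

```python
# Illustrative sketch of seat reservation policies for a train with a fixed
# number of seats; a request is accepted only if some seat is free on the
# whole interval. FirstFit takes the lowest-numbered feasible seat.

def overlaps(iv, booked):
    a, b = iv
    return any(not (b <= c or d <= a) for c, d in booked)

def assign(requests, seats, policy):
    """Return how many requests get a seat under the given policy."""
    booking = [[] for _ in range(seats)]
    accepted = 0
    for iv in requests:
        feasible = [s for s in range(seats) if not overlaps(iv, booking[s])]
        if not feasible:
            continue                       # request rejected
        occupied = lambda s: sum(d - c for c, d in booking[s])
        if policy == "first":
            s = feasible[0]                # lowest-numbered feasible seat
        elif policy == "best":
            s = max(feasible, key=occupied)
        else:                              # "worst": least-booked seat
            s = min(feasible, key=occupied)
        booking[s].append(iv)
        accepted += 1
    return accepted

reqs = [(0, 2), (2, 4), (1, 3)]
print(assign(reqs, 2, "first"), assign(reqs, 2, "best"), assign(reqs, 2, "worst"))
```

Already on this three-request instance, spreading passengers across seats (WorstFit) forfeits a booking that FirstFit and BestFit both accept.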
The relative worst order ratio applied to paging
In Proceedings of the 16th ACM-SIAM Symposium on Discrete Algorithms (SODA ’05), 2005
Abstract
Cited by 7 (4 self)
The relative worst order ratio, a new measure for the quality of online algorithms, was recently defined and applied to two bin packing problems. Here, we apply it to the paging problem. Work in progress by various researchers shows that the measure gives interesting results and new separations for bin coloring, scheduling, and seat reservation problems as well. Using the relative worst order ratio, we obtain the following results: We devise a new deterministic paging algorithm, RetrospectiveLRU, and show that it performs better than LRU. This is supported by experimental results, but contrasts with the competitive ratio. All deterministic marking algorithms have the same competitive ratio, but here we find that LRU is better than FWF. No deterministic marking algorithm can be significantly better than LRU, but the randomized algorithm MARK is better than LRU. Finally, lookahead is shown to be a significant advantage, in contrast to the competitive ratio, which does not reflect that lookahead can be helpful.
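The LRU-versus-FWF separation invisible to the competitive ratio is easy to see on a single sequence. The sketch below counts page faults for standard LRU and for FWF (Flush-When-Full, which empties the whole cache on a fault when full); the request sequence and cache size are our own toy choices, not taken from the paper.

```python
from collections import OrderedDict

def faults_lru(seq, k):
    cache = OrderedDict()
    faults = 0
    for p in seq:
        if p in cache:
            cache.move_to_end(p)
        else:
            faults += 1
            if len(cache) >= k:
                cache.popitem(last=False)   # evict least recently used
            cache[p] = True
    return faults

def faults_fwf(seq, k):
    cache = set()
    faults = 0
    for p in seq:
        if p not in cache:
            faults += 1
            if len(cache) >= k:
                cache = set()               # flush everything
            cache.add(p)
    return faults

seq = [1, 2, 3, 1, 2, 4, 1, 2, 3, 1, 2, 4]
print(faults_lru(seq, 3), faults_fwf(seq, 3))
```

Both are marking algorithms with the same competitive ratio, yet LRU incurs clearly fewer faults here, the kind of difference the relative worst order ratio is designed to expose.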