Results 1 - 8 of 8
Separating online scheduling algorithms with the relative worst order ratio
 J COMB OPTIM
, 2006
Hyperbolic dovetailing
 Proceedings of the 17th Annual European Symposium on Algorithms (ESA 2009), LNCS
, 2009
Abstract

Cited by 2 (1 self)
Abstract. A familiar quandary arises when there are several possible alternatives for the solution of a problem, but no way of knowing which, if any, are viable for a particular problem instance. Faced with this uncertainty, one is forced to simulate the parallel exploration of alternatives through some kind of coordinated interleaving (dovetailing) process. As usual, the goal is to find a solution with low total cost. Much of the existing work on such problems has assumed, implicitly or explicitly, that at most one of the alternatives is viable, providing support for a competitive analysis of algorithms (using the cost of the unique viable alternative as a benchmark). In this paper, we relax this worst-case assumption in revisiting several familiar dovetailing problems. Our main contribution is the introduction of a novel process interleaving technique, called hyperbolic dovetailing, that achieves a competitive ratio that is within a logarithmic factor of optimal on all inputs in the worst, average and expected cases, over all possible deterministic (and randomized) dovetailing schemes. We also show that no other dovetailing strategy can guarantee an asymptotically smaller competitive ratio for all inputs. An interesting application of hyperbolic dovetailing arises in the design of what we call input-thrifty algorithms, algorithms that are designed to minimize the total precision of the input requested in order to evaluate some given predicate. We show that for some very basic predicates involving real numbers we can use hyperbolic dovetailing to provide input-thrifty algorithms that are competitive, in this novel cost measure, with the best algorithms that solve these problems.
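The abstract leaves the schedule itself to the paper; one plausible reading (our own illustrative assumption, not the authors' construction) is that the i-th most promising alternative receives a share of computation time roughly proportional to 1/i. A minimal simulation of such a hyperbolic interleaving:

```python
import heapq

def hyperbolic_schedule(k, total_steps):
    """Interleave k processes so that process i (1-indexed) receives a share
    of steps roughly proportional to 1/i: always advance the process whose
    next step has the smallest "virtual time" i * (steps_i + 1)."""
    steps = [0] * k
    # heap entries: (virtual time of the process's next step, process index)
    heap = [(i + 1, i) for i in range(k)]
    heapq.heapify(heap)
    order = []  # the interleaving actually produced
    for _ in range(total_steps):
        _, i = heapq.heappop(heap)
        steps[i] += 1
        order.append(i)
        heapq.heappush(heap, ((i + 1) * (steps[i] + 1), i))
    return steps, order

steps, _ = hyperbolic_schedule(3, 600)
print(steps)  # shares roughly proportional to 1/1 : 1/2 : 1/3
```

The executed steps are exactly the 600 smallest virtual times, so process i ends up with about total_steps / (i * H_k) steps, where H_k is the k-th harmonic number.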
Adaptive Analysis of Online Algorithms
, 2007
Abstract

Cited by 1 (0 self)
Online algorithms are usually analyzed using competitive analysis, in which the performance of an online algorithm on a sequence is normalized by the performance of the optimal offline algorithm on that sequence. In this paper we introduce adaptive/cooperative analysis as an alternative general framework for the analysis of online algorithms. This model gives promising results when applied to two well-known online problems, paging and list update. The idea is to normalize the performance of an online algorithm by a measure other than the performance of the offline optimal algorithm OPT. We show that in many instances the performance of OPT on a sequence is a coarse approximation of the difficulty or complexity of a given input. Using a finer, more natural measure we can separate paging and list update algorithms which were otherwise indistinguishable under the classical model. This creates a performance hierarchy of algorithms which better reflects the intuitive relative strengths between them. Lastly, we show that, surprisingly, certain randomized algorithms which are superior to MTF in the classical model are not so in the adaptive case. This confirms that the ability of the online adaptive algorithm to ignore pathological worst cases can lead to algorithms that are more efficient in practice.
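The abstract does not name the finer measure it uses; purely as an illustration of the normalization idea, the sketch below scores LRU paging against a hypothetical difficulty measure (the number of distinct pages requested, a crude locality proxy) instead of against OPT:

```python
def lru_faults(seq, k):
    """Number of page faults of LRU with a cache of size k."""
    cache = []  # most-recently-used page kept last
    faults = 0
    for p in seq:
        if p in cache:
            cache.remove(p)
        else:
            faults += 1
            if len(cache) == k:
                cache.pop(0)  # evict the least recently used page
        cache.append(p)
    return faults

def adaptive_ratio(seq, k, measure):
    """Normalize the online cost by an input-difficulty measure, not OPT."""
    return lru_faults(seq, k) / measure(seq)

distinct_pages = lambda seq: len(set(seq))  # stand-in difficulty measure

local_seq = [1, 2, 1, 2, 3, 3, 1]        # high locality of reference
cyclic_seq = [1, 2, 3, 4, 1, 2, 3, 4]    # cyclic, worst case for LRU, k=3
print(adaptive_ratio(local_seq, 3, distinct_pages))   # 1.0
print(adaptive_ratio(cyclic_seq, 3, distinct_pages))  # 2.0
```

The ratio separates the easy, local sequence from the adversarial cyclic one, which is the kind of distinction the paper argues OPT-normalization can blur.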
Separating Scheduling Algorithms with the Relative Worst Order Ratio
Abstract

Cited by 1 (0 self)
Abstract. The relative worst order ratio is a measure for the quality of online algorithms. Unlike the competitive ratio, it compares algorithms directly without involving an optimal offline algorithm. The measure has been successfully applied to problems like paging and bin packing. In this paper, we apply it to machine scheduling. We show that for preemptive scheduling, the measure separates multiple pairs of algorithms which have the same competitive ratios; with the relative worst order ratio, the algorithm which is “intuitively better” is also provably better. Moreover, we show one such example for nonpreemptive scheduling.
A comparison of performance measures for online algorithms
, 2008
Abstract

Cited by 1 (0 self)
Abstract. This paper provides a systematic study of several proposed measures for online algorithms in the context of a specific problem, namely, the two-server problem on three collinear points. Even though the problem is simple, it encapsulates a core challenge in online algorithms, which is to balance greediness and adaptability. We examine Competitive ...
Comparing Online Algorithms for Bin Packing Problems
Abstract
Abstract. The relative worst-order ratio is a measure of the quality of online algorithms. In contrast to the competitive ratio, this measure compares two online algorithms directly instead of using an intermediate comparison with an optimal offline algorithm. In this paper, we apply the relative worst-order ratio to online algorithms for several common variants of the bin packing problem. We mainly consider pairs of algorithms that are not distinguished by the competitive ratio and show that the relative worst-order ratio prefers the intuitively better algorithm of each pair.
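To make the comparison concrete, here is a small brute-force sketch (our own toy example, not taken from the paper): the relative worst-order comparison evaluates each algorithm on its own worst permutation of the same input, here First Fit versus Next Fit on four items for bins of capacity 1:

```python
from itertools import permutations

def first_fit(items):
    """First Fit: place each item in the lowest-indexed bin where it fits."""
    bins = []  # current load of each open bin
    for x in items:
        for i, load in enumerate(bins):
            if load + x <= 1.0:
                bins[i] += x
                break
        else:
            bins.append(x)  # no bin fits; open a new one
    return len(bins)

def next_fit(items):
    """Next Fit: keep only one bin open; open a new one when the item fails."""
    bins = 0
    load = 1.0  # sentinel: forces a new bin on the first item
    for x in items:
        if load + x > 1.0:
            bins += 1
            load = x
        else:
            load += x
    return bins

def worst_order_cost(alg, items):
    """Worst cost of alg over all permutations of the input (brute force)."""
    return max(alg(list(p)) for p in permutations(items))

items = [0.5, 0.5, 0.7, 0.7]
print(worst_order_cost(first_fit, items))  # 3
print(worst_order_cost(next_fit, items))   # 4
```

On this input First Fit always pairs the two 0.5 items, so its worst permutation costs 3 bins, while the ordering 0.5, 0.7, 0.5, 0.7 forces Next Fit to open 4; comparing the two worst-order costs (4/3 here) is the flavor of comparison the measure formalizes.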
The Cooperative Ratio of Online Algorithms
, 2007
Abstract
Online algorithms are usually analyzed using competitive analysis, in which the performance of an online algorithm on a sequence is normalized by the performance of the optimal offline algorithm on that sequence. In this paper we introduce cooperative analysis as an alternative general framework for the analysis of online algorithms. The idea is to normalize the performance of an online algorithm by a measure other than the performance of the offline optimal algorithm OPT. We show that in many instances the performance of OPT on a sequence is a coarse approximation of the difficulty or complexity of a given input. Using a finer, more natural measure we can separate paging and list update algorithms which were otherwise indistinguishable under the classical model. This creates a performance hierarchy of algorithms which better reflects the intuitive relative strengths between them. Lastly, we show that, surprisingly, certain randomized algorithms which are superior to MTF in the classical model are not so in the cooperative case, which matches experimental results. This confirms that the ability of the online cooperative algorithm to ignore pathological worst cases can lead to algorithms that are more efficient in practice.