Results 1 – 4 of 4
A Tutorial on Amortized Local Competitiveness in Online Scheduling, 2011
Cited by 17 (14 self)
Abstract (excerpt): "... potential functions are used to show that a particular online algorithm is locally competitive in an amortized sense. Algorithm analyses using potential functions are sometimes criticized as seeming to be black magic ..."
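For context, the proof pattern this tutorial surveys is standard and can be sketched as follows (a generic outline of amortized local competitiveness, not text from the paper): one exhibits a potential function Φ(t) satisfying a boundary condition, no increase at discrete events (job arrivals/completions), and a running condition that holds at all other times.

```latex
% Amortized local competitiveness via a potential function \Phi(t).
% Boundary conditions and the running condition:
\begin{align*}
&\Phi(0) = 0, \qquad \Phi(\infty) \ge 0,\\
&\frac{d}{dt}\,\mathrm{ALG}(t) \;+\; \frac{d\Phi}{dt}
  \;\le\; c \cdot \frac{d}{dt}\,\mathrm{OPT}(t)
  \quad \text{whenever no job arrives or completes.}
\end{align*}
% Integrating the running condition over time and telescoping with the
% boundary conditions yields ALG <= c * OPT, i.e. c-competitiveness.
```

The "black magic" criticism the abstract mentions refers to the fact that Φ is usually pulled out of thin air and only justified after the fact by checking these conditions.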
SELFISHMIGRATE: A scalable algorithm for nonclairvoyantly scheduling heterogeneous processors, 2014
Online Nonclairvoyant Scheduling to Simultaneously Minimize All Convex Functions
Cited by 2 (1 self)
Abstract: We consider scheduling jobs online to minimize the objective ∑_{i∈[n]} w_i g(C_i − r_i), where w_i is the weight of job i, r_i is its release time, C_i is its completion time, and g is any nondecreasing convex function. Previously, it was known that the clairvoyant algorithm Highest Density First (HDF) is (2 + ε)-speed O(1)-competitive for this objective on a single machine for any fixed 0 < ε < 1 [21]. We show the first nontrivial results for this problem when g is not concave and the algorithm must be nonclairvoyant. More specifically, our results include:
• A (2 + ε)-speed O(1)-competitive nonclairvoyant algorithm on a single machine for all nondecreasing convex g, matching the performance of HDF, for any fixed 0 < ε < 1.
• A (3 + ε)-speed O(1)-competitive nonclairvoyant algorithm on multiple identical machines for all nondecreasing convex g, for any fixed 0 < ε < 1.
Our positive result on multiple machines is the first nontrivial one even when the algorithm is clairvoyant. Interestingly, all performance guarantees above hold for all nondecreasing convex functions g simultaneously. We supplement our positive results by showing that any algorithm that is oblivious to g is not O(1)-competitive with speed less than 2 on a single machine. Further, any nonclairvoyant algorithm that knows the function g cannot be O(1)-competitive with speed less than √2 on a single machine, or with speed less than 2 − 1/m on m identical machines.
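As a toy illustration of the objective ∑ w_i g(C_i − r_i) and of HDF's priority rule (a minimal sketch with invented names, not the paper's algorithm or code; HDF here simply orders jobs by density w_i/p_i):

```python
# Illustrative sketch only: the objective sum_i w_i * g(C_i - r_i) and the
# Highest Density First (HDF) ordering, which prioritizes jobs by w_i / p_i.
# All function and variable names are invented for this example.

def objective(jobs, completion, g):
    """jobs: list of (weight, release, processing); completion: index -> C_i."""
    return sum(w * g(completion[i] - r) for i, (w, r, p) in enumerate(jobs))

def hdf_order(jobs):
    """Return job indices sorted by density w/p, highest density first."""
    return sorted(range(len(jobs)), key=lambda i: jobs[i][0] / jobs[i][2],
                  reverse=True)

# Example: two jobs released at time 0, run to completion in HDF order
# on a single machine (no preemption needed since releases coincide).
jobs = [(1.0, 0.0, 4.0), (3.0, 0.0, 2.0)]  # (weight, release, processing)
order = hdf_order(jobs)                     # densities 0.25 and 1.5
t, completion = 0.0, {}
for i in order:
    t += jobs[i][2]
    completion[i] = t
# g(x) = x recovers weighted flow time; a convex g penalizes long waits more.
flow_time = objective(jobs, completion, lambda x: x)
```

With g(x) = x this computes weighted flow time (here 3·2 + 1·6 = 12); plugging in any nondecreasing convex g, e.g. `lambda x: x**2`, gives the more general objective the paper studies.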
Resource Augmentation for Weighted Flow-time explained by Dual Fitting
Abstract: We propose a general dual-fitting technique for analyzing online scheduling algorithms in the unrelated machines setting where the objective function involves weighted flow-time, and we allow the machines of the online algorithm to have (1 + ε) extra speed over the offline optimum (the so-called speed augmentation model). Typically, such algorithms are analyzed using nontrivial potential functions which yield little insight into the proof technique. We propose that one can often analyze such algorithms by looking at the dual (or Lagrangian dual) of the linear (or convex) program for the corresponding scheduling problem, and finding a feasible dual solution as the online algorithm proceeds. As representative cases, we get the following results: • For the problem of minimizing weighted flow-time, we show that the greedy algorithm of Chadha–Garg–Kumar–Muralidhara is O(1)-competitive. This is an ...
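The dual-fitting recipe the abstract describes can be summarized generically via weak LP duality (a standard argument outline, not the paper's specific program):

```latex
% Generic dual-fitting argument for a minimization problem written as
%   primal:  min c^T x   s.t.  Ax >= b,  x >= 0
%   dual:    max b^T y   s.t.  A^T y <= c,  y >= 0
% As the online algorithm runs, assign dual variables y so that
\begin{align*}
&\text{(feasibility)}\quad A^{\top} y \;\le\; c,\\
&\text{(value)}\qquad\;\; b^{\top} y \;\ge\; \tfrac{1}{\gamma}\,\mathrm{ALG}.
\end{align*}
% Weak duality gives b^T y <= OPT, hence ALG <= gamma * OPT:
% the algorithm is gamma-competitive. Speed augmentation enters by
% relaxing the dual feasibility constraints by a (1 + eps) factor.
```

The appeal over potential-function proofs is that the dual variables typically have a direct interpretation (e.g. per-job and per-machine-time contributions), so the analysis explains *why* the algorithm works rather than just certifying that it does.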