Results 1–10 of 28
Faster and simpler algorithms for multicommodity flow and other fractional packing problems
"... This paper considers the problem of designing fast, approximate, combinatorial algorithms for multicommodity flows and other fractional packing problems. We present new faster and much simpler algorithms for these problems. ..."
Abstract

Cited by 325 (5 self)
 Add to MetaCart
(Show Context)
This paper considers the problem of designing fast, approximate, combinatorial algorithms for multicommodity flows and other fractional packing problems. We present new faster and much simpler algorithms for these problems.
The Complexity of Pure Nash Equilibria
, 2004
"... We investigate from the computational viewpoint multiplayer games that are guaranteed to have pure Nash equilibria. We focus on congestion games, and show that a pure Nash equilibrium can be computed in polynomial time in the symmetric network case, while the problem is PLScomplete in general. ..."
Abstract

Cited by 169 (6 self)
 Add to MetaCart
(Show Context)
We investigate from the computational viewpoint multiplayer games that are guaranteed to have pure Nash equilibria. We focus on congestion games, and show that a pure Nash equilibrium can be computed in polynomial time in the symmetric network case, while the problem is PLS-complete in general. We discuss implications for nonatomic congestion games, and we explore the scope of the potential function method for proving existence of pure Nash equilibria.
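The existence result behind this abstract can be illustrated with best-response dynamics: in any congestion game, Rosenthal's potential strictly decreases with every improving move, so repeated best responses must reach a pure Nash equilibrium. A minimal sketch (the game, cost functions, and starting profile are our toy assumptions, not the paper's construction):

```python
# Toy symmetric congestion game: each player picks one resource, and a
# resource's cost depends only on how many players use it. Best-response
# dynamics terminate because Rosenthal's potential drops with each move.

def best_response(profile, i, resources, cost):
    # Cost player i would face on resource r, counting itself.
    def c(r):
        others = sum(1 for j, ch in enumerate(profile) if j != i and ch == r)
        return cost[r](others + 1)
    return min(resources, key=c)

def run(n_players, resources, cost):
    profile = [resources[0]] * n_players   # everyone starts on resource 0
    changed = True
    while changed:                          # loop until no player can improve
        changed = False
        for i in range(n_players):
            br = best_response(profile, i, resources, cost)
            if cost[br](0) is not None and br != profile[i]:
                profile[i] = br
                changed = True
    return profile

# 4 players, two identical linear-cost resources: equilibrium splits 2/2.
eq = run(4, [0, 1], {0: lambda k: k, 1: lambda k: k})
print(sorted(eq))  # -> [0, 0, 1, 1]
```

Note that this converges only because the game is a congestion game; general games need not admit such a potential.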
Potential Function Methods for Approximately Solving Linear Programming Problems: Theory and Practice
, 2001
"... After several decades of sustained research and testing, linear programming has evolved into a remarkably reliable, accurate and useful tool for handling industrial optimization problems. Yet, large problems arising from several concrete applications routinely defeat the very best linear programming ..."
Abstract

Cited by 155 (4 self)
 Add to MetaCart
After several decades of sustained research and testing, linear programming has evolved into a remarkably reliable, accurate and useful tool for handling industrial optimization problems. Yet, large problems arising from several concrete applications routinely defeat the very best linear programming codes, running on the fastest computing hardware. Moreover, this is a trend that may well continue and intensify, as problem sizes escalate and the need for fast algorithms becomes more stringent. Traditionally, the focus in optimization algorithms, and in particular, in algorithms for linear programming, has been to solve problems "to optimality." In concrete implementations, this has always meant the solution of problems to some finite accuracy (for example, eight digits). An alternative approach would be to explicitly, and rigorously, trade off accuracy for speed. One motivating factor is that in many practical applications, quickly obtaining a partially accurate solution is much preferable to obtaining a very accurate solution very slowly. A secondary (and independent) consideration is that the input data in many practical applications has limited accuracy to begin with. During the last ten years, a new body of research has emerged, which seeks to develop provably good approximation algorithms for classes of linear programming problems. This work both has roots in fundamental areas of mathematical programming and is also framed in the context of the modern theory of algorithms. The result of this work has been a family of algorithms with solid theoretical foundations and with growing experimental success. In this manuscript we will study these algorithms, starting with some of the very earliest examples, and through the latest theoretical and computational developments.
Approximating Fractional Multicommodity Flow Independent of the Number of Commodities
, 1999
"... We describe fully polynomial time approximation schemes for various multicommodity flow problems in graphs with m edges and n vertices. We present the first approximation scheme for maximum multicommodity flow that is independent of the number of commodities k, and our algorithm improves upon the ru ..."
Abstract

Cited by 110 (8 self)
 Add to MetaCart
(Show Context)
We describe fully polynomial time approximation schemes for various multicommodity flow problems in graphs with m edges and n vertices. We present the first approximation scheme for maximum multicommodity flow that is independent of the number of commodities k, and our algorithm improves upon the runtime of previous algorithms by this factor of k, running in O(ε^-2 m^2) time. For maximum concurrent flow, and minimum cost concurrent flow, we present algorithms that are faster than the current known algorithms when the graph is sparse or the number of commodities k is large, i.e. k > m/n. Our algorithms build on the framework proposed by Garg and Könemann [4]. They are simple, deterministic, and for the versions without costs, they are strongly polynomial. Our maximum multicommodity flow algorithm extends to an approximation scheme for the maximum weighted multicommodity flow, which is faster than those implied by previous algorithms by a factor of k/log W, where W is ...
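The Garg–Könemann framework mentioned here can be sketched compactly: repeatedly route flow along the lightest source–sink path, where each edge's length grows exponentially with the flow pushed through it, then scale the accumulated flow down to restore feasibility. The version below is a toy illustration under our own simplifying assumptions (single path per iteration, our choice of δ and scaling), not the paper's exact algorithm:

```python
# Garg-Konemann-style multiplicative-weights sketch for fractional
# maximum multicommodity flow (toy version, our simplifications).
import heapq, math

def shortest_path(adj, length, s, t):
    # Dijkstra over the current edge lengths; returns the s-t path as edges.
    dist, prev, pq = {s: 0.0}, {}, [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, math.inf):
            continue
        for v in adj[u]:
            nd = d + length[(u, v)]
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, v = [], t
    while v != s:
        path.append((prev[v], v))
        v = prev[v]
    path.reverse()
    return path

def max_flow_gk(edges, cap, pairs, eps=0.1):
    m = len(edges)
    delta = (m / (1 - eps)) ** (-1 / eps)      # tiny initial length scale
    length = {e: delta / cap[e] for e in edges}
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, [])
    flow = {e: 0.0 for e in edges}
    while True:
        # Lightest source-sink path over all commodities.
        path = min((shortest_path(adj, length, s, t) for s, t in pairs),
                   key=lambda p: sum(length[e] for e in p))
        if sum(length[e] for e in path) >= 1:
            break
        c = min(cap[e] for e in path)          # bottleneck capacity
        for e in path:
            flow[e] += c
            length[e] *= 1 + eps * c / cap[e]  # exponential length update
    # Scaling down makes the accumulated flow feasible.
    scale = math.log((1 + eps) / delta, 1 + eps)
    return {e: f / scale for e, f in flow.items()}
```

On a two-edge path with unit capacities and one commodity, the scaled flow is feasible and close to the optimum of 1, which is the (1 - O(ε)) guarantee in miniature.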
Sequential and parallel algorithms for mixed packing and covering
 In 42nd Annual IEEE Symposium on Foundations of Computer Science
, 2001
"... We describe sequential and parallel algorithms that approximately solve linear programs with no negative coefficients (a.k.a. mixed packing and covering problems). For explicitly given problems, our fastest sequential algorithm returns a solution satisfying all constraints within a ¦ ¯ factor in Ç ..."
Abstract

Cited by 67 (6 self)
 Add to MetaCart
(Show Context)
We describe sequential and parallel algorithms that approximately solve linear programs with no negative coefficients (a.k.a. mixed packing and covering problems). For explicitly given problems, our fastest sequential algorithm returns a solution satisfying all constraints within a 1 ± ε factor in O(md log(m)/ε^2) time, where m is the number of constraints and d is the maximum number of constraints any variable appears in. Our parallel algorithm runs in time polylogarithmic in the input size times ε^-4 and uses a total number of operations comparable to the sequential algorithm. The main contribution is that the algorithms solve mixed packing and covering problems (in contrast to pure packing or pure covering problems, which have only "≤" or only "≥" inequalities, but not both) and run in time independent of the so-called width of the problem.
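In this terminology, a mixed packing and covering problem asks for a feasible point of the form (notation ours, not the paper's):

```latex
\text{find } x \ge 0 \quad \text{such that} \quad
P x \le p \;\;\text{(packing)}, \qquad
C x \ge c \;\;\text{(covering)},
```

where all entries of P, C, p, c are nonnegative. The relaxed version the abstract describes returns x with Px ≤ (1 + ε)p and Cx ≥ c, or correctly reports that no exact solution exists.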
A Bundle Type Dual-Ascent Approach to Linear Multicommodity Min-Cost Flow Problems
"... We present a Cost Decomposition approach for the linear Multicommodity MinCost Flow problem, where the mutual capacity constraints are dualized and the resulting Lagrangean Dual is solved with a dualascent algorithm belonging to the class of Bundle methods. Although decomposition approaches to bl ..."
Abstract

Cited by 36 (16 self)
 Add to MetaCart
We present a Cost Decomposition approach for the linear Multicommodity Min-Cost Flow problem, where the mutual capacity constraints are dualized and the resulting Lagrangean Dual is solved with a dual-ascent algorithm belonging to the class of Bundle methods. Although decomposition approaches to block-structured Linear Programs have been reported not to be competitive with general-purpose software, our extensive computational comparison shows that, when carefully implemented, a decomposition algorithm can outperform several other approaches, especially on problems where the number of commodities is “large” with respect to the size of the graph. Our specialized Bundle algorithm is characterized by a new heuristic for the trust region parameter handling, and embeds a specialized Quadratic Program solver that allows the efficient implementation of strategies for reducing the number of active Lagrangean variables. We also exploit the structural properties of the single-commodity Min-Cost Flow subproblems to reduce the overall computational cost. The proposed approach can be easily extended to handle variants of the problem.
An Implementation of a Combinatorial Approximation Algorithm for Minimum-Cost Multicommodity Flow
, 1997
"... The minimumcost multicommodity flow problem involves simultaneously shipping multiple commodities through a single network so that the total flow obeys arc capacity constraints and has minimum cost. ..."
Abstract

Cited by 32 (3 self)
 Add to MetaCart
(Show Context)
The minimum-cost multicommodity flow problem involves simultaneously shipping multiple commodities through a single network so that the total flow obeys arc capacity constraints and has minimum cost.
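For reference, this problem is the following linear program (notation ours, not the paper's): for k commodities on a network with arc set A, arc costs c_a, arc capacities u_a, and flow-balance requirements b^i_v at each node v for commodity i,

```latex
\begin{aligned}
\min\; & \sum_{i=1}^{k} \sum_{a \in A} c_a\, x^i_a \\
\text{s.t.}\; & \sum_{a \in \delta^+(v)} x^i_a - \sum_{a \in \delta^-(v)} x^i_a = b^i_v
  && \forall\, v,\; i = 1,\dots,k \\
& \sum_{i=1}^{k} x^i_a \le u_a && \forall\, a \in A \\
& x^i_a \ge 0,
\end{aligned}
```

where x^i_a is the flow of commodity i on arc a. The capacity constraints are the only ones coupling the commodities, which is what decomposition and approximation approaches exploit.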
Faster Approximation Schemes for Fractional Multicommodity Flow Problems
"... We present fully polynomial approximation schemes for concurrent multicommodity flow problems that run in time of minimum possible dependency on the number of commodities k. We showthat by modifying the algorithms by Garg & K"onemann [7] and Fleischer [5] we can reduce their running ..."
Abstract

Cited by 29 (0 self)
 Add to MetaCart
(Show Context)
We present fully polynomial approximation schemes for concurrent multicommodity flow problems that run in time of minimum possible dependency on the number of commodities k. We show that by modifying the algorithms by Garg & Könemann [7] and Fleischer [5] we can reduce their running time on a graph with n vertices and m edges from Õ(ε^-2(m^2 + km)) to Õ(ε^-2 m^2) for an implicit representation of the output, or Õ(ε^-2(m^2 + kn)) for an explicit representation, where Õ(f) denotes a quantity that is O(f log^O(1) m). The implicit representation consists of a set of trees rooted at sources (there can be more than one tree per source), and with sinks as their leaves, together with flow values for the flow directed from the source to the sinks in a particular tree. Given this implicit representation, the approximate value of the concurrent flow is known, but if we want the explicit flow per commodity per edge, we would have to combine all these trees together, and the cost of doing so may be prohibitive. In case we want to calculate the solution flow explicitly, we modify our schemes so that they run in time within a polylogarithmic factor of nk (n is the number of nodes in the network). This is within a polylogarithmic factor of the trivial lower bound of time Ω(nk) needed to explicitly write down a multicommodity flow of k commodities in a network of n nodes. Therefore our schemes are within a polylogarithmic factor of the minimum possible dependency of the running time on the number of commodities k.
Approximation Algorithms for General Packing Problems with Modified Logarithmic Potential Function
, 2002
"... In this paper we present an approximation algorithm based on a Lagrangian decomposition via a logarithmic potential reduction to solve a general packing or minmax resource sharing problem with M nonnegative convex constraints on a convex set B. We generalize a method by Grigoriadis et al to the cas ..."
Abstract

Cited by 25 (13 self)
 Add to MetaCart
In this paper we present an approximation algorithm based on a Lagrangian decomposition via a logarithmic potential reduction to solve a general packing or min-max resource sharing problem with M nonnegative convex constraints on a convex set B. We generalize a method by Grigoriadis et al. to the case with weak approximate block solvers (i.e. with only constant, logarithmic or even worse approximation ratios). We show that the algorithm needs at most O(M(ε^-2 ln ε^-1 + ln M)) calls to the block solver, a bound independent of the data and the approximation ratio of the block solver. For small approximation ratios the algorithm needs at most O(M(ε^-2 + ln M)) calls to the block solver.
Online End-to-End Congestion Control
 IEEE Foundations of Computer Science
, 2002
"... Congestion control in the current Internet is accomplished mainly by TCP/IP. To understand the macroscopic network behavior that results from TCP/IP and similar endtoend protocols, one main analytic technique is to show that the the protocol maximizes some global objective function of the network t ..."
Abstract

Cited by 12 (0 self)
 Add to MetaCart
(Show Context)
Congestion control in the current Internet is accomplished mainly by TCP/IP. To understand the macroscopic network behavior that results from TCP/IP and similar end-to-end protocols, one main analytic technique is to show that the protocol maximizes some global objective function of the network traffic. Here we analyze a particular end-to-end, MIMD (multiplicative-increase, multiplicative-decrease) protocol. We show that if all users of the network use the protocol, and all connections last for at least logarithmically many rounds, then the total weighted throughput (value of all packets received) is near the maximum possible. Our analysis includes round-trip times, and (in contrast to most previous analyses) gives explicit convergence rates, allows connections to start and stop, and allows capacities to change.

1. Congestion control and optimization

Congestion control in the current Internet is accomplished mainly by TCP/IP — 90% of Internet traffic is TCP-based [41]. Meanwhile the design and analysis of TCP and other end-to-end congestion-control protocols are only partially understood and are becoming the subject of increasing attention [25, 28]. One main analytic technique is to interpret the protocol as solving some underlying combinatorial optimization problem on the network — to show that the protocol causes the traffic distribution, over time, to optimize some global objective function
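The MIMD rule itself is easy to simulate. The sketch below uses our own simplifying assumptions (a single shared link, a binary loss signal, arbitrary constants), not the paper's model with round-trip times and changing capacities; it shows the characteristic behavior that rates grow and shrink multiplicatively while sender ratios are preserved:

```python
# Toy simulation of an MIMD (multiplicative-increase,
# multiplicative-decrease) protocol on one shared link.
# Loss signal, alpha, beta, and capacity are illustrative assumptions.

def mimd(rates, capacity, alpha=0.1, beta=0.5, rounds=200):
    totals = []
    for _ in range(rounds):
        congested = sum(rates) > capacity      # shared-link loss signal
        factor = (1 - beta) if congested else (1 + alpha)
        rates = [r * factor for r in rates]    # same multiplier for everyone
        totals.append(sum(rates))
    return rates, totals

rates, totals = mimd([0.01, 0.02], capacity=1.0)
```

After the initial growth phase, the total rate oscillates within a constant factor of capacity, and the 1:2 ratio between the two senders is preserved exactly, since every round applies the same multiplier to all rates.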