Results 1 - 10 of 21
An update on the Hirsch conjecture - Jahresber. Dtsch. Math.-Ver., 2010
"... Abstract The Hirsch Conjecture (1957) stated that the graph of a d-dimensional polytope with n facets cannot have (combinatorial) diameter greater than n − d. That is, any two vertices of the polytope can be connected by a path of at most n − d edges. This paper presents the first counterexample t ..."
Cited by 41 (3 self)
Abstract The Hirsch Conjecture (1957) stated that the graph of a d-dimensional polytope with n facets cannot have (combinatorial) diameter greater than n − d. That is, any two vertices of the polytope can be connected by a path of at most n − d edges. This paper presents the first counterexample to the conjecture. Our polytope has dimension 43 and 86 facets. It is obtained from a 5-dimensional polytope with 48 facets that violates a certain generalization of the d-step conjecture of Klee and Walkup.
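For orientation (standard background, not part of the abstract): the conjectured bound can be written as diam(G(P)) ≤ n − d, where G(P) is the vertex-edge graph of a d-dimensional polytope P with n facets. The d-cube meets it with equality: it has n = 2d facets and graph diameter exactly d = n − d. A counterexample with d = 43 and n = 86, as above, therefore must have diameter at least n − d + 1 = 44.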
A subexponential lower bound for Zadeh’s pivoting rule for solving linear programs and games
"... ..."
The simplex method is strongly polynomial for deterministic Markov Decision Processes - In Proceedings of the 24th ACM-SIAM Symposium on Discrete Algorithms, SODA, 2013
"... We prove that the simplex method with the highest gain/most-negative-reduced cost pivoting rule converges in strongly polynomial time for deterministic Markov decision processes (MDPs) regardless of the discount factor. For a deterministic MDP with n states and m actions, we prove the simplex method ..."
Cited by 6 (0 self)
We prove that the simplex method with the highest gain/most-negative-reduced cost pivoting rule converges in strongly polynomial time for deterministic Markov decision processes (MDPs) regardless of the discount factor. For a deterministic MDP with n states and m actions, we prove the simplex method runs in O(n³m² log² n) iterations if the discount factor is uniform and O(n⁵m³ log² n) iterations if each action has a distinct discount factor. Previously the simplex method was known to run in polynomial time only for discounted MDPs where the discount was bounded away from 1 [Ye11]. Unlike in the discounted case, the algorithm does not greedily converge to the optimum, and we require a more complex measure of progress. We identify a set of layers in which the values of primal variables must lie and show that the simplex method always makes progress optimizing one layer, and when the upper layer is updated the algorithm makes a substantial amount of progress. In the case of nonuniform discounts, we define a polynomial number of “milestone” policies and we prove that, while the objective function may not improve substantially overall, the value of at least one dual variable is always making progress towards some milestone, and the algorithm will reach the next milestone in a polynomial number of steps.
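As a rough illustration of the pivoting rule described above (a minimal sketch under assumptions of our own, not code from the paper): on a discounted deterministic MDP, the simplex method with this rule amounts to repeatedly switching the single state-action pair with the largest positive gain, i.e. the most negative reduced cost. The toy instance, the discount value GAMMA and the function names below are illustrative only.

GAMMA = 0.9  # uniform discount factor, assumed for the sketch

def evaluate(policy, actions, n, sweeps=2000):
    # Policy evaluation by repeated backups; converges because GAMMA < 1.
    v = [0.0] * n
    for _ in range(sweeps):
        v = [actions[policy[s]][2] + GAMMA * v[actions[policy[s]][1]]
             for s in range(n)]
    return v

def highest_gain_simplex(actions, n):
    # actions: list of (src, dst, reward); start from the first action per state.
    policy = {s: next(i for i, a in enumerate(actions) if a[0] == s)
              for s in range(n)}
    while True:
        v = evaluate(policy, actions, n)
        # Gain of switching state a[0] to action i; largest gain = most negative reduced cost.
        best_gain, best_i = max((a[2] + GAMMA * v[a[1]] - v[a[0]], i)
                                for i, a in enumerate(actions))
        if best_gain <= 1e-9:                # no improving switch: policy is optimal
            return policy, v
        policy[actions[best_i][0]] = best_i  # the single simplex pivot / switch

# Toy instance with 3 states; the initial policy is deliberately suboptimal.
acts = [(0, 2, 0.0), (0, 1, 1.0), (1, 0, 0.0), (1, 2, 2.0), (2, 2, 0.5)]
print(highest_gain_simplex(acts, 3))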
Recent progress on the combinatorial diameter of polytopes and simplicial complexes, 2013
A subexponential lower bound for the Least Recently Considered rule for solving linear programs and games
"... The simplex algorithm is among the most widely used algorithms for solving linear programs in practice. Most pivoting rules are known, however, to need an exponential number of steps to solve some linear programs. No non-polynomial lower bounds were known, prior to this work, for Cunningham’s Least ..."
Cited by 3 (1 self)
The simplex algorithm is among the most widely used algorithms for solving linear programs in practice. Most pivoting rules are known, however, to need an exponential number of steps to solve some linear programs. No non-polynomial lower bounds were known, prior to this work, for Cunningham’s Least Recently Considered rule [5], which belongs to the family of history-based rules. Also known as the ROUND-ROBIN rule, Cunningham’s pivoting method fixes an initial ordering on all variables first, and then selects the improving variables in a round-robin fashion. We provide the first subexponential (i.e., of the form 2^Ω(√n)) lower bound for this rule in a concrete setting. Our lower bound is obtained by utilizing connections between pivoting steps performed by simplex-based algorithms and improving switches performed by policy iteration algorithms for 1-player and 2-player games. We start by building 2-player parity games (PGs) on which the policy iteration with the ROUND-ROBIN rule performs a subexponential number of iterations. We then transform the parity games into 1-player Markov Decision Processes (MDPs) which correspond almost immediately to concrete linear programs.
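To make the rule concrete (a minimal, generic sketch under our own assumptions; the paper itself works with parity games, MDPs and concrete linear programs): the ordering of variables is fixed once, and each pivot scans cyclically from the position of the previously chosen variable for the next improving one. The function names and the callback interface below are illustrative, not the paper's.

def round_robin_pivoting(variables, is_improving, apply_pivot, max_steps=100000):
    # variables: the fixed initial ordering (Cunningham's rule never reorders it).
    # is_improving(v) and apply_pivot(v) are callbacks that query and update the
    # underlying simplex / policy-iteration state.
    last = -1                                # position of the most recent pivot
    for _ in range(max_steps):
        n = len(variables)
        for offset in range(1, n + 1):       # scan cyclically, starting after `last`
            i = (last + offset) % n
            if is_improving(variables[i]):
                apply_pivot(variables[i])
                last = i
                break
        else:                                # no improving variable remains: done
            return
    raise RuntimeError("step limit reached")

# Tiny demo where "pivoting" on a variable simply stops it from improving again.
improving = {"x1", "x3"}
round_robin_pivoting(["x1", "x2", "x3"], lambda v: v in improving, improving.discard)
print(improving)  # -> set()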
Errata for: A subexponential lower bound for the Random Facet algorithm for Parity Games, 2014
"... In [Friedmann, Hansen, and Zwick (2011)] and we claimed that the expected number of pivoting steps performed by the Random-Facet algorithm of Kalai and of Matoušek, Sharir, and Welzl is equal to the expected number of pivoting steps performed by Random-Facet∗, a variant of Random-Facet that bases i ..."
Cited by 2 (2 self)
In [Friedmann, Hansen, and Zwick (2011)] we claimed that the expected number of pivoting steps performed by the Random-Facet algorithm of Kalai and of Matoušek, Sharir, and Welzl is equal to the expected number of pivoting steps performed by Random-Facet∗, a variant of Random-Facet that bases its random decisions on one random permutation. We then obtained a lower bound on the expected number of pivoting steps performed by Random-Facet∗ and claimed that the same lower bound holds also for Random-Facet. Unfortunately, the claim that the expected numbers of steps performed by Random-Facet and Random-Facet∗ are the same is false. We provide here simple examples that show that the expected numbers of steps performed by the two algorithms are not the same.
Towards polynomial simplex-like algorithms for market equilibria, 2013
"... In this paper we consider the problem of computing mar-ket equilibria in the Fisher setting for utility models such as spending constraint and perfect, price-discrimination. These models were inspired from modern e-commerce settings and attempt to bridge the gap between the computationally hard but ..."
Cited by 1 (1 self)
In this paper we consider the problem of computing market equilibria in the Fisher setting for utility models such as spending constraint and perfect price-discrimination. These models were inspired by modern e-commerce settings and attempt to bridge the gap between the computationally hard but realistic separable, piecewise-linear and concave utility model and the tractable but less relevant linear utility case. While there are polynomial time algorithms known for these problems, the question of whether there exist polynomial time Simplex-like algorithms has remained elusive, even for linear markets. Such algorithms are desirable due to their conceptual simplicity, ease of implementation and practicality. This paper takes a significant step towards this goal by presenting the first Simplex-like algorithms for these markets assuming a positive resolution of an algebraic problem of Cucker, Koiran and Smale. Unconditionally, our algorithms are FPTASs; they compute prices and allocations such that each buyer derives at least a 1/(1+ε)-fraction of the utility at a true market equilibrium, and their running times are polynomial in the input length and 1/ε. We start with convex programs which capture market equilibria in each setting and, in a systematic way, convert them into linear complementarity problem (LCP) formulations. Then, departing from previous approaches which try to pivot on a single polyhedron associated to the LCP obtained, we carefully construct a polynomial-length sequence of polyhedra, one containing the other, such that starting from an optimal solution to one allows us to obtain an optimal solution to the next in the sequence in a polynomial number of complementary pivot steps. Our framework to convert a convex program into an LCP and then come up with a Simplex-like algorithm that moves on a sequence of connected polyhedra may be of independent interest.
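For reference (standard background, not specific to this paper): a linear complementarity problem asks, given a square matrix M and a vector q, for vectors w and z with

w = Mz + q,   w ≥ 0,   z ≥ 0,   zᵀw = 0,

and complementary pivot methods in the style of Lemke's algorithm walk along bases in which at most one complementary pair (w_i, z_i) has both variables basic.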