Results 1–10 of 30
The NP-completeness column: an ongoing guide
Journal of Algorithms, 1985
Abstract

Cited by 189 (0 self)
This is the nineteenth edition of a (usually) quarterly column that covers new developments in the theory of NP-completeness. The presentation is modeled on that used by M. R. Garey and myself in our book "Computers and Intractability: A Guide to the Theory of NP-Completeness," W. H. Freeman & Co., New York, 1979 (hereinafter referred to as "[G&J]"; previous columns will be referred to by their dates). A background equivalent to that provided by [G&J] is assumed, and, when appropriate, cross-references will be given to that book and the list of problems (NP-complete and harder) presented there. Readers who have results they would like mentioned (NP-hardness, PSPACE-hardness, polynomial-time solvability, etc.) or open problems they would like publicized, should
Smoothed analysis of algorithms: why the simplex algorithm usually takes polynomial time
2003
Abstract

Cited by 145 (14 self)
We introduce the smoothed analysis of algorithms, which continuously interpolates between the worst-case and average-case analyses of algorithms. In smoothed analysis, we measure the maximum over inputs of the expected performance of an algorithm under small random perturbations of that input. We measure this performance in terms of both the input size and the magnitude of the perturbations. We show that the simplex algorithm has smoothed complexity polynomial in the input size and the standard deviation of
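In symbols (a sketch using standard notation, not quoted from the paper): writing T(x) for the running time on input x and g for a Gaussian perturbation, the measure described above is

```latex
C_{\mathrm{smoothed}}(n,\sigma)
  \;=\;
  \max_{x \,:\, |x| = n}\;
  \operatorname{E}_{g \sim \mathcal{N}(0,\,\sigma^{2} I)}
  \bigl[\, T(x + g) \,\bigr]
```

so taking σ → 0 recovers worst-case analysis, while large σ approaches average-case analysis over random inputs.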
Hard-to-solve bimatrix games
Econometrica, 2006
Abstract

Cited by 23 (1 self)
The Lemke–Howson algorithm is the classical method for finding one Nash equilibrium of a bimatrix game. This paper presents a class of square bimatrix games for which this algorithm takes, even in the best case, an exponential number of steps in the dimension d of the game. Using polytope theory, the games are constructed using pairs of dual cyclic polytopes with 2d suitably labeled facets in d-space. The construction is extended to non-square games where, in addition to exponentially long Lemke–Howson computations, finding an equilibrium by support enumeration takes on average exponential time.
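As a concrete sketch of what support enumeration must verify (pure Python; the `is_nash` helper and the matching-pennies example are illustrative, not from the paper): each candidate strategy pair generated by enumerating supports is kept only if neither player has a profitable pure-strategy deviation.

```python
# A mixed-strategy pair (x, y) is a Nash equilibrium of the bimatrix
# game (A, B) iff no pure strategy of either player beats it.
# Support enumeration proposes candidate (x, y) pairs and applies
# exactly this best-response test to each.

def expected_payoffs(A, x, y):
    """Payoff of each row of A against y, and the value under x."""
    row_payoffs = [sum(A[i][j] * y[j] for j in range(len(y)))
                   for i in range(len(A))]
    value = sum(x[i] * row_payoffs[i] for i in range(len(x)))
    return row_payoffs, value

def is_nash(A, B, x, y, tol=1e-9):
    """True if (x, y) is a Nash equilibrium of the bimatrix game (A, B)."""
    row_pay, vx = expected_payoffs(A, x, y)
    # Row player: no pure strategy may beat x against y.
    if any(p > vx + tol for p in row_pay):
        return False
    # Column player: payoff of each column of B against x.
    col_pay = [sum(x[i] * B[i][j] for i in range(len(x)))
               for j in range(len(B[0]))]
    vy = sum(y[j] * col_pay[j] for j in range(len(y)))
    return not any(p > vy + tol for p in col_pay)

# Matching pennies: the unique equilibrium mixes 50/50 on both sides.
A = [[1, -1], [-1, 1]]
B = [[-1, 1], [1, -1]]
print(is_nash(A, B, [0.5, 0.5], [0.5, 0.5]))  # True
print(is_nash(A, B, [1.0, 0.0], [0.5, 0.5]))  # False: column player deviates
```

The hard games in the paper are built so that exponentially many such candidate supports fail this test before an equilibrium is found.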
Smoothed Analysis of Termination of Linear Programming Algorithms
Abstract

Cited by 21 (3 self)
We perform a smoothed analysis of a termination phase for linear programming algorithms. By combining this analysis with the smoothed analysis of Renegar’s condition number by Dunagan, Spielman and Teng
Beyond Hirsch conjecture: Walks on random polytopes and smoothed complexity of the simplex method
In Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science, 2006
Abstract

Cited by 18 (3 self)
The smoothed analysis of algorithms is concerned with the expected running time of an algorithm under slight random perturbations of arbitrary inputs. Spielman and Teng proved that the shadow-vertex simplex method has polynomial smoothed complexity. On a slight random perturbation of an arbitrary linear program, the simplex method finds the solution after a walk on polytope(s) with expected length polynomial in the number of constraints n, the number of variables d, and the inverse standard deviation of the perturbation 1/σ. We show that the length of the walk in the simplex method is actually polylogarithmic in the number of constraints n. Spielman–Teng's bound on the walk was O*(n^86 d^55 σ^-30), up to logarithmic factors. We improve this to O(log^7 n (d^9 + d^3 σ^-4)). This shows that the tight Hirsch conjecture bound n − d on the length of a walk on polytopes is not a limitation for smoothed linear programming. Random perturbations create short paths between vertices. We propose a randomized phase I for solving arbitrary linear programs, which is of independent interest. Instead of finding a vertex of a feasible set, we add a vertex at
Criss-Cross Methods: A Fresh View on Pivot Algorithms
Mathematical Programming, 1997
"... this paper is to present mathematical ideas and ..."
Frontiers of stochastically nondominated portfolios
Econometrica, 2003
Abstract

Cited by 15 (3 self)
We consider the problem of constructing a portfolio of finitely many assets whose returns are described by a discrete joint distribution. We propose mean–risk models which are solvable by linear programming and generate portfolios whose returns are nondominated in the sense of second-order stochastic dominance. Next, we develop a specialized parametric method for recovering the entire mean–risk efficient frontiers of these models and we illustrate its operation on a large data set involving thousands of assets and realizations.
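A minimal sketch of the dominance criterion the abstract refers to (pure Python; the two return distributions are invented for illustration): X dominates Y in the second-order sense iff the expected shortfall of X is no larger than that of Y at every threshold, and for discrete distributions it suffices to check thresholds at the outcome points, since the shortfall difference is piecewise linear with kinks only there.

```python
# Second-order stochastic dominance (SSD) for discrete return
# distributions: X dominates Y iff E[max(t - X, 0)] <= E[max(t - Y, 0)]
# for every threshold t.

def expected_shortfall(dist, t):
    """E[max(t - R, 0)] for a discrete distribution {outcome: prob}."""
    return sum(p * max(t - r, 0.0) for r, p in dist.items())

def ssd_dominates(x, y, tol=1e-12):
    """True if distribution x dominates y in the SSD sense."""
    thresholds = set(x) | set(y)   # kinks of the shortfall functions
    return all(expected_shortfall(x, t) <= expected_shortfall(y, t) + tol
               for t in thresholds)

# x has the same mean as y but less spread, so x SSD-dominates y
# while y does not dominate x (made-up numbers).
x = {0.05: 0.5, 0.15: 0.5}   # mean return 0.10
y = {0.00: 0.5, 0.20: 0.5}   # mean return 0.10, riskier
print(ssd_dominates(x, y))   # True
print(ssd_dominates(y, x))   # False
```

The LP models in the paper produce portfolios whose return distributions pass this test against every feasible alternative.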
SECURE AND PRIVATE COLLABORATIVE LINEAR PROGRAMMING
Abstract

Cited by 10 (0 self)
The growth of the Internet has created tremendous opportunities for online collaborations. These often involve collaborative optimizations where the two parties are, for example, jointly minimizing costs without violating their own particular constraints (e.g., one party may have too much inventory, another too little inventory but too much production capacity, etc.). Many of these optimizations can be formulated as linear programming problems, or, rather, as collaborative linear programming, in which two parties need to jointly optimize based on their own private inputs. It is often important to have online collaboration techniques and protocols that carry this out without either party revealing to the other anything about their own private inputs to the optimization (other than, unavoidably, what can be deduced from the collaboratively computed optimal solution). For example, two organizations who jointly invest in a project … the negotiations, of actually exchanging the resources, etc.). Although there are so many situations where collaboration is mutually advantageous, it often does not occur and its potential goes unexploited; that is, the participants often do not engage in the trade even though its outcome would be mutually beneficial to both of them. This occurs when the online negotiation is "too revealing" of the participants' private or proprietary data: that a company has a massive excess of bandwidth, inventory, production capacity, etc., can be damaging to the company's future negotiating position (or even to its stock price and the survival of its management). The formulation of such online collaboration often leads to a linear-programming formulation, and the task is then to solve this
IMPROVED ASYMPTOTIC ANALYSIS OF THE AVERAGE NUMBER OF STEPS PERFORMED BY THE SELF-DUAL SIMPLEX ALGORITHM
1986
Abstract

Cited by 9 (1 self)
In this paper we analyze the average number of steps performed by the self-dual simplex algorithm for linear programming, under the probabilistic model of spherical symmetry. The model was proposed by Smale. Consider a problem of n variables with m constraints. Smale established that for every number of constraints m, there is a constant c(m) such that the number of pivot steps of the self-dual algorithm, p(m, n), is less than c(m)(ln n) raised to a power that depends only on m. We improve upon this estimate by showing that p(m, n) is bounded by a function of m only. The symmetry of the function in m and n implies that p(m, n) is in fact bounded by a function of the smaller of m and n.