Results 1–10 of 15
A randomized polynomial-time simplex algorithm for linear programming
In STOC, 2006
Abstract

Cited by 19 (4 self)
We present the first randomized polynomial-time simplex algorithm for linear programming. Like the other known polynomial-time algorithms for linear programming, its running time depends polynomially on the number of bits used to represent its input. We begin by reducing the input linear program to a special form in which we merely need to certify boundedness. As boundedness does not depend upon the right-hand-side vector, we run the shadow-vertex simplex method with a random right-hand-side vector. Thus, we do not need to bound the diameter of the original polytope. Our analysis rests on a geometric statement of independent interest: given a polytope Ax ≤ b in isotropic position, if one makes a polynomially small perturbation to b then the number of edges of the projection of the perturbed polytope onto a random 2-dimensional subspace is expected to be polynomial.
On the Existence of a Short Admissible Pivot Sequence for Feasibility and Linear Optimization Problems
1999
Abstract

Cited by 3 (1 self)
In this paper, for the feasibility problem, we prove the existence of a short admissible pivot sequence from an arbitrary basis to a feasible basis. Regarding the general LP problem, the existence of a short admissible pivot sequence from an arbitrary basis to an optimal basis is proved without any nondegeneracy assumptions. Our constructive proofs are based on techniques that are used in strongly polynomial basis identification schemes of interior point methods. The result can be regarded as an admissible pivot version of the d-step ...
New Variants Of Finite Criss-Cross Pivot Algorithms For Linear Programming
1997
Abstract

Cited by 2 (0 self)
In this paper we generalize the so-called first-in-last-out pivot rule and the most-often-selected-variable pivot rule for the simplex method, as proposed in Zhang [13], to the criss-cross pivot setting where neither primal nor dual feasibility is preserved. The finiteness of the new criss-cross pivot variants is proven.
Towards a Unified Framework for Randomized Pivoting Algorithms in Linear Programming
In Operations Research Proceedings, 1998
Abstract

Cited by 1 (0 self)
In this paper we present a unified framework in which we describe two known algorithms as special simplex methods and analyse their complexities and differences.
Linear Concurrent Constraint Programming Over Reals
1998
Abstract

Cited by 1 (0 self)
We introduce a constraint system LC that handles arithmetic constraints over reals within the linear concurrent constraint programming (lcc) framework. This approach provides us with a general, extensible foundation for linear programming algorithm design that comes with a (linear) logical semantics. In particular, it allows us to build a `glass-box' version of the (constraint solver) simplex algorithm by defining (monotone) cc ask and tell agents over a higher-level constraint system as lcc(LC) programs. We illustrate at the same time the use of the lcc framework as a non-trivial concurrent algorithm specification tool. Constraint-based programming languages are based on a functional separation between a program that successively generates pieces of partial information called constraints, and a constraint solver that collects, combines, simplifies and detects inconsistencies between these constraints. Initially, constraint solvers were monolithic programs written in ...
Solving inequalities and proving Farkas’ lemma made easy
In Amer. Math. Monthly
Abstract

Cited by 1 (1 self)
The texts published in the HEC research report series are the sole responsibility of their authors. The publication of these research reports is supported by a grant from the Fonds F.C.A.R.
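As context for this entry's title (a standard textbook statement, not quoted from the paper itself), Farkas' lemma in one common form says that for a matrix A and vector b, exactly one of the two alternatives holds:

```latex
\text{either}\quad \exists\, x \ge 0 \ \text{such that}\ Ax = b,
\qquad\text{or}\qquad
\exists\, y \ \text{such that}\ A^{\top}y \ge 0 \ \text{and}\ b^{\top}y < 0.
```

The two cannot hold simultaneously, since they would give $0 > b^{\top}y = x^{\top}(A^{\top}y) \ge 0$; the substance of the lemma is that one of them always does.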
THE CENTRAL CURVE IN LINEAR PROGRAMMING
Abstract

Cited by 1 (0 self)
The central curve of a linear program is an algebraic curve specified by linear and quadratic constraints arising from complementary slackness. It is the union of the various central paths for minimizing or maximizing the cost function over any region in the associated hyperplane arrangement. We determine the degree, arithmetic genus and defining prime ideal of the central curve, thereby answering a question of Bayer and Lagarias. These invariants, along with the degree of the Gauss image of the curve, are expressed in terms of the matroid of the input matrix. Extending work of Dedieu, Malajovich and Shub, this yields an instance-specific bound on the total curvature of the central path, a quantity relevant for interior point methods. The global geometry of central curves is studied in detail.
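The "central paths" referenced above are the standard primal–dual central paths; in common notation (supplied here for context, not quoted from the abstract), for the pair $\min\{c^{\top}x : Ax = b,\ x \ge 0\}$ and its dual, the path with parameter $\lambda > 0$ solves

```latex
Ax = b,\qquad A^{\top}y + s = c,\qquad x_i s_i = \lambda \ \ (i = 1,\dots,n),\qquad x,\, s > 0.
```

Eliminating the parameter $\lambda$ leaves the quadratic conditions $x_i s_i = x_j s_j$, which together with the linear equations is the sense in which the central curve is cut out by "linear and quadratic constraints arising from complementary slackness."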
Criss-Cross Pivoting Rules
Abstract

Cited by 1 (0 self)
Assuming that the reader is familiar with both the primal and dual simplex methods, Zionts' criss-cross method can easily be explained.
- It can be initialized by any, possibly both primal and dual infeasible, basis. If the basis is optimal, we are done. If the basis is not optimal, then there are some primal or dual infeasible variables. One might choose any of these. It is advised to choose alternately a primal and then a dual infeasible variable, if possible.
- If the selected variable is dual infeasible, then it enters the basis and the leaving variable is chosen among the primal feasible variables in such a way that primal feasibility of the currently primal feasible variables is preserved. If no such basis exchange is possible, another infeasible variable is selected.
- If the selected variable is primal infeasible, then it leaves the basis and the entering variable is chosen among the ...
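The bullet points above describe Zionts' original rule. As a concrete, runnable illustration of the criss-cross idea, here is a toy sketch of the related finite least-index criss-cross variant (Terlaky's rule rather than Zionts' original; all names are my own), using a dense tableau with exact rational arithmetic for max c^T x subject to Ax <= b, x >= 0:

```python
from fractions import Fraction

def criss_cross(A, b, c):
    """Least-index criss-cross method for  max c^T x  s.t.  Ax <= b, x >= 0.

    Illustrative sketch only: dense tableau, no performance tuning.
    Returns (status, x, objective).
    """
    m, n = len(A), len(c)
    F = Fraction
    # Tableau rows: [A | I | b]; cost row: [-c | 0 | 0].
    T = [[F(A[i][j]) for j in range(n)]
         + [F(int(i == k)) for k in range(m)] + [F(b[i])]
         for i in range(m)]
    cost = [-F(cj) for cj in c] + [F(0)] * (m + 1)
    basis = list(range(n, n + m))           # slack variables start basic

    def pivot(r, q):
        piv = T[r][q]
        T[r] = [v / piv for v in T[r]]
        for i in range(m):
            if i != r and T[i][q]:
                f = T[i][q]
                T[i] = [v - f * w for v, w in zip(T[i], T[r])]
        f = cost[q]
        if f:
            cost[:] = [v - f * w for v, w in zip(cost, T[r])]
        basis[r] = q

    while True:
        # Smallest-index infeasible variable, primal or dual.
        cand = [j for j in range(n + m) if j not in basis and cost[j] < 0]
        cand += [basis[i] for i in range(m) if T[i][-1] < 0]
        if not cand:                        # primal and dual feasible: optimal
            x = [F(0)] * n
            for i, bi in enumerate(basis):
                if bi < n:
                    x[bi] = T[i][-1]
            return 'optimal', x, cost[-1]
        k = min(cand)
        if k not in basis:                  # dual infeasible: k enters
            rows = [i for i in range(m) if T[i][k] > 0]
            if not rows:
                return 'unbounded_or_infeasible', None, None
            pivot(min(rows, key=lambda i: basis[i]), k)
        else:                               # primal infeasible: k leaves
            r = basis.index(k)
            cols = [j for j in range(n + m)
                    if j not in basis and T[r][j] < 0]
            if not cols:
                return 'infeasible', None, None
            pivot(r, min(cols))
```

Unlike the simplex method, intermediate bases may be both primal and dual infeasible; the least-index choices are what guarantee finiteness.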
The Finite Criss-Cross Method for Hyperbolic Programming
Informatica, Technische Universiteit Delft, The Netherlands, 1996
Abstract
In this paper the finite criss-cross method is generalized to solve hyperbolic programming problems. Just as in the case of linear or quadratic programming, the criss-cross method can be initialized with any, not necessarily feasible, basic solution. Finiteness of the procedure is proved under the usual mild assumptions. Some small numerical examples illustrate the main features of the algorithm. Key words: hyperbolic programming, pivoting, criss-cross method. The hyperbolic (fractional linear) programming problem is a natural generalization of the linear programming problem. The linear constraints are kept, but the linear objective function is replaced by a quotient of two linear functions. Such fractional linear objective functions arise in economic models when the goal is to optimize profit/allocation-type functions (see for instance [12]). The objective function of the hyperbolic programming problem is neither linear nor convex; however, there are several ...
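For context (the classical Charnes–Cooper transformation, a standard fact not taken from this paper): when the denominator $d^{\top}x + \beta$ is positive on the feasible region, such a fractional objective can also be reduced to an ordinary LP by homogenization,

```latex
\max\ \frac{c^{\top}x + \alpha}{d^{\top}x + \beta}
\ \ \text{s.t.}\ Ax \le b
\qquad\longrightarrow\qquad
\max\ c^{\top}y + \alpha t
\ \ \text{s.t.}\ Ay \le bt,\ \ d^{\top}y + \beta t = 1,\ \ t \ge 0,
```

via the substitution $t = 1/(d^{\top}x + \beta)$, $y = tx$; the criss-cross generalization in this entry instead works on the fractional problem directly.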
LEAST-INDEX ANTICYCLING RULES, LindAcR
1998
Abstract
... this paper. Least-index rules were designed for network flow problems, linear optimization problems, linear complementarity problems and oriented matroid programming problems. These classes will be considered in the sequel.