Results 1 - 6 of 6
Criss-Cross Methods: A Fresh View on Pivot Algorithms
 Mathematical Programming
, 1997
"... this paper is to present mathematical ideas and ..."
A Strongly Polynomial Rounding Procedure Yielding a Maximally Complementary Solution for P*(κ) Linear Complementarity Problems
, 1998
Abstract

Cited by 5 (4 self)
We deal with Linear Complementarity Problems (LCPs) with P*(κ) matrices. First we establish the convergence rate of the complementary variables along the central path. The central path is parameterized by the barrier parameter μ, as usual. Our elementary proof reproduces the known result that the variables on, or close to, the central path fall into three classes in which these variables are O(1), O(μ) and O(√μ), respectively. The constants hidden in these bounds are expressed in, or bounded by, the input data. All this is preparation for our main result: a strongly polynomial rounding procedure. Given a point with sufficiently small complementarity gap and close enough to the central path, the rounding procedure produces a maximally complementary solution in at most O(n³) arithmetic operations. The result implies that Interior Point Methods (IPMs) not only converge to a complementary solution of P*(κ) LCPs but, when furnished with our rounding procedure, they can produce a max...
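The three-class partition the abstract describes can be illustrated with a minimal sketch: for a point near the central path with x_i · s_i ≈ μ, each index falls into a class where x_i is large (B), s_i is large (N), or both are of order √μ (T). The threshold below is a hypothetical illustration, not the paper's actual rule or constants.

```python
import math

def classify_variables(x, s, mu):
    """Partition indices into the three magnitude classes O(1), O(mu),
    O(sqrt(mu)) for a point near the central path (x_i * s_i ~ mu).
    The cutoff is a hypothetical illustrative choice."""
    B, N, T = [], [], []
    cutoff = 10 * math.sqrt(mu)      # hypothetical separation threshold
    for i, (xi, si) in enumerate(zip(x, s)):
        if xi > cutoff >= si:
            B.append(i)              # x_i = O(1), s_i = O(mu)
        elif si > cutoff >= xi:
            N.append(i)              # s_i = O(1), x_i = O(mu)
        else:
            T.append(i)              # both x_i and s_i of order sqrt(mu)
    return B, N, T
```

For example, with mu = 1e-8 the pairs (1.0, 1e-8), (1e-8, 1.0) and (1e-4, 1e-4) land in B, N and T respectively.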
Criss-Cross Pivoting Rules
Abstract

Cited by 1 (0 self)
Assuming that the reader is familiar with both the primal and dual simplex methods, Zionts' criss-cross method can easily be explained.
- It can be initialized by any, possibly both primal and dual infeasible, basis. If the basis is optimal, we are done. If the basis is not optimal, then there are some primal or dual infeasible variables. One might choose any of these. It is advised to choose alternately a primal and then a dual infeasible variable, if possible.
- If the selected variable is dual infeasible, then it enters the basis and the leaving variable is chosen among the primal feasible variables in such a way that primal feasibility of the currently primal feasible variables is preserved. If no such basis exchange is possible, another infeasible variable is selected.
- If the selected variable is primal infeasible, then it leaves the basis and the entering variable is chosen among th...
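The pivoting scheme above can be sketched in code. The sketch below implements the least-index criss-cross rule (the finite variant surveyed in this literature, not Zionts' alternating rule exactly): start from any basis, pick the smallest-index infeasible variable, and pivot it in or out depending on whether it is dual or primal infeasible. The function name and tolerances are illustrative assumptions.

```python
import numpy as np

def criss_cross(A, b, c, basis, max_iter=1000):
    """Least-index criss-cross method for min c.x s.t. Ax = b, x >= 0.
    `basis` may be any, possibly primal and dual infeasible, basis.
    Returns (status, x) with status in {"optimal", "infeasible", "unbounded"}."""
    m, n = A.shape
    basis = list(basis)
    for _ in range(max_iter):
        B_inv = np.linalg.inv(A[:, basis])
        x_B = B_inv @ b                      # basic variable values
        y = c[basis] @ B_inv                 # dual multipliers
        red = c - y @ A                      # reduced costs (zero on the basis)
        nonbasic = [j for j in range(n) if j not in basis]
        # smallest index among primal-infeasible basics and dual-infeasible nonbasics
        bad = sorted([basis[i] for i in range(m) if x_B[i] < -1e-9] +
                     [j for j in nonbasic if red[j] < -1e-9])
        if not bad:
            x = np.zeros(n)
            x[basis] = x_B
            return "optimal", x
        k = bad[0]
        if k in basis:                       # k is primal infeasible: it leaves
            r = basis.index(k)
            row = B_inv[r] @ A
            cand = [j for j in nonbasic if row[j] < -1e-9]
            if not cand:                     # no admissible exchange: no feasible point
                return "infeasible", None
            basis[r] = min(cand)             # least-index entering variable
        else:                                # k is dual infeasible: it enters
            col = B_inv @ A[:, k]
            cand = [i for i in range(m) if col[i] > 1e-9]
            if not cand:                     # dual has no feasible point (primal unbounded)
                return "unbounded", None
            r = min(cand, key=lambda i: basis[i])  # least-index leaving variable
            basis[r] = k
    raise RuntimeError("iteration limit reached")
```

For instance, the LP min -x1 - x2 subject to x1 + x2 + s1 = 4, x1 + s2 = 3 can be started from the slack basis {s1, s2} and pivots to the optimum x = (3, 1, 0, 0).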
The Finite Criss-Cross Method for Hyperbolic Programming
 INFORMATICA, TECHNISCHE UNIVERSITEIT DELFT, THE NETHERLANDS
, 1996
Abstract
In this paper the finite criss-cross method is generalized to solve hyperbolic programming problems. Just as in the case of linear or quadratic programming, the criss-cross method can be initialized with any, not necessarily feasible, basic solution. Finiteness of the procedure is proved under the usual mild assumptions. Some small numerical examples illustrate the main features of the algorithm.
Polynomial Affine-Scaling Algorithms for P*(κ) Linear Complementarity Problems
, 1997
Abstract
A family of primal-dual affine-scaling algorithms is presented for Linear Complementarity Problems (LCPs) with P*-matrices. These algorithms were first introduced by Jansen et al. for solving linear optimization problems and later also applied to LCPs with positive semidefinite matrices. We show that the same algorithmic concept applies to LCPs with P*-matrices and that the resulting algorithms admit polynomial-time iteration bounds.