Results 1–10 of 22
Using SeDuMi 1.02, a MATLAB toolbox for optimization over symmetric cones
, 1998
Cited by 1334 (4 self)
Abstract: SeDuMi is an add-on for MATLAB that lets you solve optimization problems with linear, quadratic, and semidefiniteness constraints. It is possible to have complex-valued data and variables in SeDuMi. Moreover, large-scale optimization problems are solved efficiently by exploiting sparsity. This paper describes how to work with this toolbox.
A simplified homogeneous and self-dual linear programming algorithm and its implementation
 Annals of Operations Research
, 1996
Cited by 63 (5 self)
Abstract: 1 Introduction. Consider the linear programming (LP) problem in the standard form: (LP) minimize c^T x ...
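The excerpt above breaks off after stating the standard form. As a self-contained illustration of that form (not the paper's homogeneous self-dual method), a tiny LP min c^T x subject to Ax = b, x ≥ 0 can be solved by enumerating basic solutions, since an optimum, if one exists, is attained at a vertex:

```python
import itertools
import numpy as np

# Solve a tiny standard-form LP  min c^T x  s.t.  Ax = b, x >= 0
# by enumerating basic feasible solutions (vertices).  Exponential in
# general; purely to illustrate the standard form, not a practical method.

def solve_standard_lp(A, b, c):
    A, b, c = np.asarray(A, float), np.asarray(b, float), np.asarray(c, float)
    m, n = A.shape
    best_x, best_val = None, np.inf
    for basis in itertools.combinations(range(n), m):
        B = A[:, basis]
        if abs(np.linalg.det(B)) < 1e-12:
            continue  # singular basis: no unique basic solution
        xB = np.linalg.solve(B, b)
        if np.all(xB >= -1e-9):  # the basic solution is feasible
            x = np.zeros(n)
            x[list(basis)] = xB
            val = c @ x
            if val < best_val:
                best_x, best_val = x, val
    return best_x, best_val

# min -x1 - 2*x2  s.t.  x1 + x2 + x3 = 4,  x >= 0   (x3 acts as a slack)
x, v = solve_standard_lp([[1.0, 1.0, 1.0]], [4.0], [-1.0, -2.0, 0.0])
print(x, v)  # optimum puts everything on x2: x = [0, 4, 0], value -8
```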
Using SeDuMi 1.0x, a MATLAB toolbox for optimization over symmetric cones
, 1999
Cited by 46 (0 self)
Abstract: SeDuMi is an add-on for MATLAB, which lets you solve optimization problems with linear, quadratic, and semidefiniteness constraints. It is possible to have complex-valued data and variables in SeDuMi. Moreover, large-scale optimization problems are solved efficiently by exploiting sparsity. This paper describes how to work with this toolbox.
A Path to the Arrow-Debreu Competitive Market Equilibrium
 Math. Programming
, 2004
Cited by 42 (7 self)
Abstract: We present polynomial-time interior-point algorithms for solving the Fisher and Arrow-Debreu competitive market equilibrium problems with linear utilities and n players. Both of them have an arithmetic operation complexity bound of O(n^4 log(1/ε)) for computing an ε-equilibrium solution. If the problem data are rational numbers and their bit-length is L, then the bound to generate an exact solution is O(n^4 L), which is in line with the best complexity bound for linear programming of the same dimension and size. This is a significant improvement over the previously best bound O(n^8 log(1/ε)) for approximating the two problems using other methods. The key ingredient to derive these results is to show that these problems admit convex optimization formulations, efficient barrier functions, and fast rounding techniques. We also present a continuous path leading to the set of Arrow-Debreu equilibria, similar to the central path developed for linear programming interior-point methods. This path is derived from the weighted logarithmic utility and barrier functions and the Brouwer fixed-point theorem. The defining equations are bilinear and possess some primal-dual structure for the application of the Newton-based path-following method.
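The paper's interior-point algorithms are beyond a short sketch, but the equilibrium notion itself can be made concrete on a tiny linear Fisher market via proportional-response dynamics, a different and much simpler method than the one above; the instance data below is made up:

```python
# Tiny linear Fisher market solved by proportional-response dynamics.
# This is NOT the interior-point method of the paper, just an executable
# illustration of what a market equilibrium is.  Buyer i has budget B[i]
# and linear utility sum_j U[i][j] * x[i][j] over goods in unit supply.

def fisher_equilibrium(B, U, iters=200):
    n_buyers, n_goods = len(U), len(U[0])
    # b[i][j] is the money buyer i bids on good j; start budgets split evenly
    b = [[B[i] / n_goods] * n_goods for i in range(n_buyers)]
    for _ in range(iters):
        prices = [sum(b[i][j] for i in range(n_buyers)) for j in range(n_goods)]
        x = [[b[i][j] / prices[j] for j in range(n_goods)] for i in range(n_buyers)]
        # each buyer re-bids in proportion to the utility earned from each good
        for i in range(n_buyers):
            u_i = sum(U[i][j] * x[i][j] for j in range(n_goods))
            b[i] = [B[i] * U[i][j] * x[i][j] / u_i for j in range(n_goods)]
    prices = [sum(b[i][j] for i in range(n_buyers)) for j in range(n_goods)]
    x = [[b[i][j] / prices[j] for j in range(n_goods)] for i in range(n_buyers)]
    return prices, x

# Symmetric instance: two buyers with mirrored utilities and equal budgets.
prices, x = fisher_equilibrium([1.0, 1.0], [[2.0, 1.0], [1.0, 2.0]])
print(prices)  # equilibrium prices approach [1, 1]; buyer 0 ends up with good 0
```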
Combining Interior-Point and Pivoting Algorithms for Linear Programming
 Management Science
, 1996
Probabilistic Analysis of an Infeasible-Interior-Point Algorithm for Linear Programming
, 1998
Cited by 12 (3 self)
Abstract: We consider an infeasible-interior-point algorithm, endowed with a finite termination scheme, applied to random linear programs generated according to a model of Todd. Such problems have degenerate optimal solutions, and possess no feasible starting point. We use no information regarding an optimal solution in the initialization of the algorithm. Our main result is that the expected number of iterations before termination with an exact optimal solution is O(n ln(n)).
Keywords: Linear Programming, Average-Case Behavior, Infeasible-Interior-Point Algorithm.
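A bare-bones infeasible primal-dual path-following iteration of the general kind analyzed can be sketched as follows, assuming the standard perturbed KKT Newton system and a fixed centering parameter; without the finite-termination scheme this is only in the spirit of, not identical to, the paper's algorithm:

```python
import numpy as np

# Bare-bones infeasible primal-dual path-following method for
# min c^T x  s.t.  Ax = b, x >= 0, started from the uninformed point
# x = s = e, y = 0 (feasibility is NOT assumed).  Fixed centering sigma,
# no finite-termination scheme: a sketch, not the paper's exact algorithm.

def ipm(A, b, c, sigma=0.5, tol=1e-9, max_iter=200):
    A, b, c = np.asarray(A, float), np.asarray(b, float), np.asarray(c, float)
    m, n = A.shape
    x, s, y = np.ones(n), np.ones(n), np.zeros(m)
    for _ in range(max_iter):
        r_p = b - A @ x               # primal infeasibility
        r_d = c - A.T @ y - s         # dual infeasibility
        mu = x @ s / n                # duality measure
        if max(np.linalg.norm(r_p), np.linalg.norm(r_d), mu) < tol:
            break
        # Newton step on the perturbed KKT system, via normal equations
        d = x / s
        M = A @ (d[:, None] * A.T)
        rhs = r_p + A @ (d * r_d - sigma * mu / s + x)
        dy = np.linalg.solve(M, rhs)
        ds = r_d - A.T @ dy
        dx = d * (A.T @ dy - r_d) + sigma * mu / s - x
        # damped step keeping (x, s) strictly positive
        alpha = 1.0
        for v, dv in ((x, dx), (s, ds)):
            neg = dv < 0
            if neg.any():
                alpha = min(alpha, 0.99 * np.min(-v[neg] / dv[neg]))
        x, s, y = x + alpha * dx, s + alpha * ds, y + alpha * dy
    return x, y, s

# min -x1 - 2*x2  s.t.  x1 + x2 + x3 = 4, x >= 0  ->  x = (0, 4, 0)
x, y, s = ipm([[1.0, 1.0, 1.0]], [4.0], [-1.0, -2.0, 0.0])
print(x, x @ [-1.0, -2.0, 0.0])  # objective tends to -8
```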
A new iteration-complexity bound for the MTY predictor-corrector algorithm
 SIAM Journal on Optimization
Cited by 10 (4 self)
Abstract: In this paper we present a new iteration-complexity bound for the Mizuno–Todd–Ye predictor-corrector (MTY P-C) primal-dual interior-point algorithm for linear programming. The analysis of the paper is based on the important notion of crossover events introduced by Vavasis and Ye. For a standard-form linear program min{c^T x : Ax = b, x ≥ 0} with decision variable x ∈ R^n, we show that the MTY P-C algorithm, started from a well-centered interior-feasible solution with duality gap nμ_0, finds an interior-feasible solution with duality gap less than nη in O(T(μ_0/η) + n^3.5 log(χ̄*_A)) iterations, where T(t) ≡ min{n^2 log(log t), log t} for all t > 0 and χ̄*_A is a scaling-invariant condition number associated with the matrix A. More specifically, χ̄*_A is the infimum of all the condition numbers χ̄_AD, where D varies over the set of positive diagonal matrices. Under the setting of the Turing machine model, our analysis yields an O(n^3.5 L_A + min{n^2 log L, L}) iteration-complexity bound for the MTY P-C algorithm to find a primal-dual optimal solution, where L_A and L are the input sizes of the matrix A and the data (A, b, c), respectively. This contrasts well with the classical iteration-complexity bound for the MTY P-C algorithm, which depends linearly on L instead of log L.
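Predictor-corrector analyses of this kind work in neighborhoods of the central path defined through a centrality measure; a tiny helper (illustrative only, not taken from the paper) makes the "well-centered" notion concrete:

```python
import math

def proximity(x, s):
    """2-norm centrality measure used in short-step and predictor-corrector
    style path-following analyses: ||x*s/mu - e||_2 with mu = x^T s / n.
    A point lies in the neighborhood N_2(theta) when this is <= theta.
    Illustrative helper, not code from the paper."""
    n = len(x)
    mu = sum(xi * si for xi, si in zip(x, s)) / n
    return math.sqrt(sum((xi * si / mu - 1.0) ** 2 for xi, si in zip(x, s)))

# A perfectly centered point (all products x_i * s_i equal) has proximity 0.
print(proximity([1.0, 2.0, 4.0], [4.0, 2.0, 1.0]))  # 0.0
# An off-center point has positive proximity.
print(proximity([1.0, 1.0], [1.0, 3.0]))
```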
On generalized branching methods for mixed integer programming
, 2004
Cited by 9 (1 self)
Abstract: In this paper we present a restructuring of the computations in Lenstra's methods for solving mixed integer linear programs. We show that the problem of finding a good branching hyperplane can be formulated on an adjoint lattice of the kernel lattice of the equality constraints without requiring any dimension reduction. As a consequence, short-lattice-vector algorithms, such as Lenstra–Lenstra–Lovász (LLL) [15] or the generalized basis reduction algorithm of Lovász and Scarf [18], are described in the space of the original variables. Based on these results we give a new, natural heuristic way of generating branching hyperplanes, and discuss its relationship with recent reformulation techniques of Aardal and Lenstra [1]. We show that the reduced basis available at the root node has useful information on the branching hyperplanes for the generalized branch-and-bound tree. Based on these results, algorithms are also given for solving mixed convex integer programs.
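The short-vector routine invoked here, LLL, is compact enough to state in full; below is a textbook version with exact rational arithmetic (the generic algorithm only, not the paper's adjoint-lattice construction):

```python
from fractions import Fraction

# Textbook Lenstra-Lenstra-Lovasz (LLL) lattice basis reduction with
# exact rational arithmetic.  Recomputes Gram-Schmidt from scratch after
# every basis change: simple and correct, though inefficient.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(B):
    n = len(B)
    Bs, mu = [], [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        v = [Fraction(t) for t in B[i]]
        for j in range(i):
            mu[i][j] = dot(B[i], Bs[j]) / dot(Bs[j], Bs[j])
            v = [v[k] - mu[i][j] * Bs[j][k] for k in range(len(v))]
        Bs.append(v)
    return Bs, mu

def lll(B, delta=Fraction(3, 4)):
    B = [[Fraction(t) for t in row] for row in B]
    n = len(B)
    Bs, mu = gram_schmidt(B)
    k = 1
    while k < n:
        for j in range(k - 1, -1, -1):       # size reduction
            q = round(mu[k][j])
            if q:
                B[k] = [B[k][i] - q * B[j][i] for i in range(len(B[k]))]
                Bs, mu = gram_schmidt(B)
        if dot(Bs[k], Bs[k]) >= (delta - mu[k][k - 1] ** 2) * dot(Bs[k - 1], Bs[k - 1]):
            k += 1                            # Lovasz condition holds
        else:
            B[k - 1], B[k] = B[k], B[k - 1]   # swap and step back
            Bs, mu = gram_schmidt(B)
            k = max(k - 1, 1)
    return B

reduced = lll([[1, 1, 1], [-1, 0, 2], [3, 5, 6]])
print(reduced)  # classic example; a norm-1 vector such as [0, 1, 0] appears
```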
A cutting surface method for uncertain linear programs with polyhedral stochastic dominance constraints
 SIAM Journal on Optimization
Cited by 5 (2 self)
Abstract: In this paper we study linear optimization problems with multidimensional linear positive second-order stochastic dominance constraints. By using the polyhedral properties of the second-order linear dominance condition, we present a cutting-surface algorithm and show its finite convergence. The cut generation problem is a difference-of-convex-functions (DC) optimization problem. We exploit the polyhedral structure of this problem to present a novel branch-and-cut algorithm that incorporates concepts from concave minimization and binary integer programming. A linear programming problem is formulated for generating concavity cuts in our case, where the polyhedron is unbounded. We also present duality results for this problem relating the dual multipliers to utility functions, without the need to impose constraint qualifications, which again is possible because of the polyhedral nature of the problem. Numerical examples are presented showing the nature of solutions of our model.
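The second-order dominance condition behind such constraints can be checked directly on finite discrete distributions; a small sketch with made-up example data:

```python
from fractions import Fraction

# Second-order stochastic dominance (SSD) check for finite discrete
# distributions: X SSD-dominates Y iff E[(eta - X)+] <= E[(eta - Y)+]
# for every threshold eta.  For finite supports it suffices to test eta
# at the support points, since both expected-shortfall functions are
# piecewise linear with kinks only there (and slope 1 beyond the maximum).

def expected_shortfall(dist, eta):
    """E[(eta - Z)+] for dist given as a list of (value, probability)."""
    return sum(p * max(eta - z, 0) for z, p in dist)

def ssd_dominates(X, Y):
    support = sorted({z for z, _ in X} | {z for z, _ in Y})
    return all(expected_shortfall(X, eta) <= expected_shortfall(Y, eta)
               for eta in support)

# A sure payoff of 1 dominates a 50/50 gamble on 0 or 2 (same mean,
# less risk), but not the other way around.
X = [(Fraction(1), Fraction(1))]
Y = [(Fraction(0), Fraction(1, 2)), (Fraction(2), Fraction(1, 2))]
print(ssd_dominates(X, Y), ssd_dominates(Y, X))  # True False
```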
On Free Variables in Interior Point Methods
, 1997
Cited by 3 (0 self)
Abstract: In this paper we have selected the primal-dual logarithmic barrier algorithm to present our ideas, because it and its modified versions are considered, in general, to be the most efficient in practice. The computational results presented in this paper were obtained using implementations of this algorithm. It is to be noted, however, that this choice has notational consequences only. Practically, any interior point method, even nonlinear ones, can be discussed in a similar linear algebra framework. Let us consider the linear programming problem ...
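The classical treatment of a free variable, to which work like this develops alternatives, is to split it as a difference of two nonnegative variables; a sketch of that transformation on made-up data:

```python
import numpy as np

# A free variable x_j (no sign restriction) is classically handled by
# splitting x_j = u_j - v_j with u_j, v_j >= 0, which appends a
# sign-flipped copy of that column to A (and of that entry to c).
# This sketches the transformation itself; the point of the paper is
# that interior point methods can instead treat free variables directly.

def split_free(A, c, free_cols):
    A, c = np.asarray(A, float), np.asarray(c, float)
    A_split = np.hstack([A, -A[:, free_cols]])     # append -A_F columns
    c_split = np.concatenate([c, -c[free_cols]])   # append -c_F entries
    return A_split, c_split

A = np.array([[1.0, 2.0],
              [0.0, 1.0]])
c = np.array([3.0, -1.0])
A2, c2 = split_free(A, c, [1])   # declare column 1 free

# The point x = (1, -2) maps to the nonnegative point (x1, u, v) = (1, 0, 2)
# with identical constraint values and objective value.
x_nn = np.array([1.0, 0.0, 2.0])
print(A2 @ x_nn, A @ np.array([1.0, -2.0]))  # equal vectors
print(c2 @ x_nn, c @ np.array([1.0, -2.0]))  # equal objectives: 5.0 5.0
```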