Results 1–10 of 11
Solving Real-World Linear Ordering Problems …
, 1995
Abstract

Cited by 30 (8 self)
Cutting plane methods require the solution of a sequence of linear programs, where the solution to one provides a warm start to the next. A cutting plane algorithm for solving the linear ordering problem is described. This algorithm uses the primal-dual interior point method to solve the linear programming relaxations. A point which is a good warm start for a simplex-based cutting plane algorithm is generally not a good starting point for an interior point method. Techniques used to improve the warm start include attempting to identify cutting planes early and storing an old feasible point, which is used to help recenter when cutting planes are added. Computational results are described for some real-world problems; the algorithm appears to be competitive with a simplex-based cutting plane algorithm.
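The warm-started cutting plane loop this abstract describes can be sketched in miniature. Below is a hypothetical Kelley-style cutting plane method on a one-variable convex problem, with the previous minimizer seeding each new round; it is only a sketch of the general pattern, not the paper's primal-dual interior point algorithm, and the grid search stands in for the LP solve.

```python
# Kelley-style cutting plane loop (illustrative only): minimize
# f(x) = x^2 over [-2, 3].  Each round adds the linearization ("cut")
# of f at the current point, and the previous minimizer seeds the next
# round -- a toy analogue of the warm start discussed above.

def f(x):
    return x * x

def fprime(x):
    return 2.0 * x

def model(t, cuts):
    # Piecewise-linear lower model built from the accumulated cuts.
    return max(fk + gk * (t - xk) for xk, fk, gk in cuts)

lo, hi = -2.0, 3.0
grid = [lo + (hi - lo) * k / 2000 for k in range(2001)]  # stand-in for the LP solve
cuts = []
x = hi  # starting point

for _ in range(15):
    cuts.append((x, f(x), fprime(x)))
    x = min(grid, key=lambda t: model(t, cuts))  # warm-started relaxation solve

lower_bound = model(x, cuts)  # valid lower bound on min f
gap = f(x) - lower_bound      # shrinks as cuts accumulate
```

Because each cut is a tangent of the convex objective, the model always underestimates f, so the gap between the incumbent value and the model minimum certifies the quality of the current iterate.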
Polynomial interior point cutting plane methods
 Optimization Methods and Software
, 2003
Abstract

Cited by 25 (9 self)
Polynomial cutting plane methods based on the logarithmic barrier function and on the volumetric center are surveyed. These algorithms construct a linear programming relaxation of the feasible region, find an appropriate approximate center of the region, and call a separation oracle at this approximate center to determine whether additional constraints should be added to the relaxation. Typically, these cutting plane methods can be developed so as to exhibit polynomial convergence. The volumetric cutting plane algorithm achieves the theoretical minimum number of calls to a separation oracle. Long-step versions of the algorithms for solving convex optimization problems are presented.
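The "approximate center" these methods work from can be illustrated by computing the analytic center of a small polytope via damped Newton on the log barrier. The 2-D polytope data below is made up for illustration, and this sketches only the centering subproblem, not the full polynomial cutting plane algorithm.

```python
import math

# Approximate analytic center of {x : a_i . x <= b_i} by damped Newton on
# the log barrier phi(x) = -sum_i log(b_i - a_i . x).  Hypothetical toy
# polytope: the unit box plus the extra cut x + y <= 1.5.
A = [(-1.0, 0.0), (0.0, -1.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
b = [0.0, 0.0, 1.0, 1.0, 1.5]

def slacks(x):
    return [bi - ai[0] * x[0] - ai[1] * x[1] for ai, bi in zip(A, b)]

def phi(x):
    s = slacks(x)
    if min(s) <= 0.0:
        return float("inf")  # infeasible: barrier is +infinity
    return -sum(math.log(si) for si in s)

x = [0.3, 0.3]  # strictly feasible starting point
for _ in range(50):
    s = slacks(x)
    # Gradient sum_i a_i/s_i and Hessian sum_i a_i a_i^T / s_i^2.
    g = [sum(ai[k] / si for ai, si in zip(A, s)) for k in range(2)]
    H = [[sum(ai[j] * ai[k] / si ** 2 for ai, si in zip(A, s))
          for k in range(2)] for j in range(2)]
    # Newton direction: solve H dx = -g (2x2 system by hand).
    det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
    dx = [(-g[0] * H[1][1] + g[1] * H[0][1]) / det,
          (-g[1] * H[0][0] + g[0] * H[1][0]) / det]
    t = 1.0  # backtrack to stay feasible and decrease phi
    while phi([x[0] + t * dx[0], x[1] + t * dx[1]]) > phi(x):
        t *= 0.5
        if t < 1e-12:
            break
    x = [x[0] + t * dx[0], x[1] + t * dx[1]]

grad_norm = math.hypot(*[sum(ai[k] / si for ai, si in zip(A, slacks(x)))
                         for k in range(2)])
```

At the analytic center the barrier gradient vanishes, so a tiny `grad_norm` certifies that the computed point is an (approximate) center of the relaxation.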
INTERIOR POINT METHODS FOR COMBINATORIAL OPTIMIZATION
, 1995
Abstract

Cited by 16 (9 self)
Research on using interior point algorithms to solve combinatorial optimization and integer programming problems is surveyed. This paper discusses branch and cut methods for integer programming problems, a potential reduction method based on transforming an integer programming problem to an equivalent nonconvex quadratic programming problem, interior point methods for solving network flow problems, and methods for solving multicommodity flow problems, including an interior point column generation algorithm.
Column Generation with a Primal-Dual Method
, 1997
Abstract

Cited by 12 (6 self)
A simple column generation scheme that employs an interior point method to solve underlying restricted master problems is presented. In contrast with the classical column generation approach, where restricted master problems are solved exactly, the method presented in this paper consists in solving them to a predetermined optimality tolerance (loose at the beginning and appropriately tightened when the optimum is approached). An infeasible primal-dual interior point method which employs the notion of the μ-center to control the distance to optimality is used to solve the restricted master problem. Similarly to the analytic center cutting plane method, the present approach takes full advantage of the use of central prices. Furthermore, it offers more freedom in the choice of optimization strategy, as it adaptively adjusts the required optimality tolerance in the master to the observed rate of convergence of the column generation process. The proposed method has been implemented and used to solv...
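The loose-then-tight tolerance idea can be sketched on a toy cutting stock instance. All data below is hypothetical, the restricted master is solved by brute-force basis enumeration rather than the paper's infeasible primal-dual interior point method, and the gradually tightened tolerance is applied to the pricing acceptance threshold as a simple stand-in for the master's optimality tolerance.

```python
from itertools import combinations

# Column generation sketch (cutting stock LP relaxation): rolls of width
# 10 must cover demands for pieces of widths 3, 5, 7.  Hypothetical data.
W, widths, demand = 10, [3, 5, 7], [25.0, 20.0, 18.0]
m = len(widths)

def solve3(M, rhs):
    """Solve a 3x3 linear system by Gauss-Jordan; None if singular."""
    a = [row[:] + [r] for row, r in zip(M, rhs)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(a[r][c]))
        if abs(a[p][c]) < 1e-9:
            return None
        a[c], a[p] = a[p], a[c]
        for r in range(3):
            if r != c:
                f = a[r][c] / a[c][c]
                a[r] = [v - f * w for v, w in zip(a[r], a[c])]
    return [a[r][3] / a[r][r] for r in range(3)]

def master(cols):
    """Restricted master (min sum x, Ax = d, x >= 0) by basis enumeration."""
    best = None
    for idx in combinations(range(len(cols)), m):
        B = [[cols[j][i] for j in idx] for i in range(m)]
        x = solve3(B, demand)
        if x is None or min(x) < -1e-9:
            continue
        obj = sum(x)
        if best is None or obj < best[0] - 1e-12:
            Bt = [[B[j][i] for j in range(m)] for i in range(m)]
            y = solve3(Bt, [1.0] * m)       # duals: B^T y = 1
            best = (obj, dict(zip(idx, x)), y)
    return best

def price(y):
    """Pricing: unbounded knapsack maximizing y.a s.t. widths.a <= W."""
    val, pat = [0.0] * (W + 1), [[0] * m for _ in range(W + 1)]
    for c in range(1, W + 1):
        for i, w in enumerate(widths):
            if w <= c and val[c - w] + y[i] > val[c]:
                val[c] = val[c - w] + y[i]
                pat[c] = pat[c - w][:]
                pat[c][i] += 1
    return val[W], pat[W]

# Initial columns: single-width patterns.
cols = [[(W // w) if i == j else 0 for i in range(m)] for j, w in enumerate(widths)]
tol = 0.05
for _ in range(20):
    obj, xB, y = master(cols)
    v, a = price(y)
    if v <= 1.0 + tol:            # no column prices out: stop
        break
    cols.append(a)
    tol = max(tol * 0.1, 1e-6)    # tighten as the optimum nears
```

The loose initial threshold accepts any clearly improving column cheaply; tightening it as the process converges mirrors the paper's strategy of matching solution accuracy to the observed convergence rate.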
Constraint reduction for linear programs with many inequality constraints
 SIAM Journal on Optimization
, 2006
Abstract

Cited by 11 (4 self)
Consider solving a linear program in standard form, where the constraint matrix A is m × n, with n ≫ m ≫ 1. Such problems arise, for example, as the result of finely discretizing a semi-infinite program. The cost per iteration of typical primal-dual interior-point methods on such problems is O(m²n). We propose to reduce that ...
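The reduction exploits the structure of the normal-equations matrix M = A D Aᵀ = Σⱼ dⱼ aⱼ aⱼᵀ: each column of A contributes a rank-one term, so summing only a subset of k well-chosen columns costs O(m²k) instead of O(m²n). A minimal sketch on synthetic data follows; the sharply decaying weights are an assumption chosen to mimic the situation near an optimum, where most inequality constraints are inactive.

```python
import math

# Constraint-reduction sketch: approximate M = sum_j d_j a_j a_j^T using
# only the k columns with the largest weights d_j.  All data synthetic.
m, n, k = 3, 200, 20
cols = [[math.sin((i + 1) * (j + 1)) for i in range(m)] for j in range(n)]
d = [2.0 ** (-j) for j in range(n)]      # sharply decaying weights (assumed)

def normal_matrix(index_set):
    # O(m^2 * |index_set|) accumulation of the rank-one terms.
    M = [[0.0] * m for _ in range(m)]
    for j in index_set:
        for p in range(m):
            for q in range(m):
                M[p][q] += d[j] * cols[j][p] * cols[j][q]
    return M

full = normal_matrix(range(n))                     # O(m^2 n)
top = sorted(range(n), key=lambda j: -d[j])[:k]    # most critical columns
reduced = normal_matrix(top)                       # O(m^2 k)
err = max(abs(full[p][q] - reduced[p][q])
          for p in range(m) for q in range(m))
```

With weights decaying like 2⁻ʲ, the discarded tail contributes at most about 2⁻ᵏ⁺¹ to any entry, so the reduced matrix is an excellent stand-in for the full one at a tenth of the cost.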
Logarithmic Barrier Decomposition Methods for Semi-Infinite Programming
, 1996
Abstract

Cited by 10 (1 self)
A computational study of some logarithmic barrier decomposition algorithms for semi-infinite programming is presented in this paper. The conceptual algorithm is a straightforward adaptation of the logarithmic barrier cutting plane algorithm, which was presented recently by den Hertog et al., to solve semi-infinite programming problems. Usually, decomposition (cutting plane) methods use cutting planes to improve the localization of the given problem. In this paper we propose an extension which uses linear cuts to solve large-scale, difficult real-world problems. This algorithm uses both static and (doubly) dynamic enumeration of the parameter space and allows multiple cuts to be added simultaneously for larger/difficult problems. The algorithm is implemented on both sequential and parallel computers. Implementation issues and parallelization strategies are discussed and encouraging computational results are presented. Keywords: column generation, convex programming, cutting plane met...
A Long-Step, Cutting Plane Algorithm for Linear and Convex Programming
, 2000
Abstract

Cited by 10 (6 self)
A cutting plane method for linear programming is described. This method is an extension of Atkinson and Vaidya’s algorithm, and uses the central trajectory. The logarithmic barrier function is used explicitly, motivated partly by the successful implementation of such algorithms. This makes it possible to maintain primal and dual iterates, thus allowing termination at will, instead of having to solve to completion. This algorithm has the same complexity (O(nL²) iterations) as Atkinson and Vaidya’s algorithm, but improves upon it in that it is a ‘long-step’ version, while theirs is a ‘short-step’ one in some sense. For this reason, this algorithm is computationally much more promising as well. This algorithm can be of use in solving combinatorial optimization problems with large numbers of constraints, such as the Traveling Salesman Problem.
Interior Point Algorithms for Integer Programming
, 1994
Abstract

Cited by 7 (4 self)
Research on using interior point algorithms to solve integer programming problems is surveyed. This paper concentrates on branch and bound and cutting plane methods; a potential function method is also briefly mentioned. The principal difficulty with using an interior point algorithm in a branch and cut method to solve integer programming problems is in warm starting the algorithm efficiently. Methods for overcoming this difficulty are described and other features of the algorithms are given. This paper focuses on the techniques necessary to obtain an efficient computational implementation; there is a short discussion of theoretical issues.
Adaptive constraint reduction for training support vector machines
, 2007
Abstract

Cited by 7 (3 self)
A support vector machine (SVM) determines whether a given observed pattern lies in a particular class. The decision is based on prior training of the SVM on a set of patterns with known classification, and training is achieved by solving a convex quadratic programming problem. Since there are typically a large number of training patterns, this can be expensive. In this work, we propose an adaptive constraint reduction primal-dual interior-point method for training a linear SVM with ℓ1 penalty (hinge loss) for misclassification. We reduce the computational effort by assembling the normal equation matrix using only a well-chosen subset of patterns. Starting with a large portion of the patterns, our algorithm excludes more and more unnecessary patterns as the iteration proceeds. We extend our approach to training nonlinear SVMs through Gram matrix approximation methods. We demonstrate the effectiveness of the algorithm on a variety of standard test problems.
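The ℓ1 hinge-loss objective the abstract refers to can be written down and minimized directly. The sketch below is not the paper's interior-point method: it is plain full-batch subgradient descent on the same objective, over made-up 2-D data. Note that only patterns violating the margin contribute to the subgradient, which is the intuition behind excluding "unnecessary" patterns from the computation.

```python
# Linear SVM with l1 hinge loss, trained by subgradient descent on
#   (lam/2)||w||^2 + (1/n) * sum_i max(0, 1 - y_i (w.x_i + b)).
# Hypothetical, linearly separable 2-D data; not the paper's method.
X = [(2, 1), (3, 2), (2, 3), (4, 2), (-2, -1), (-3, -2), (-2, -3), (-4, -2)]
y = [1, 1, 1, 1, -1, -1, -1, -1]
n, lam, lr = len(X), 0.01, 0.1
w, b = [0.0, 0.0], 0.0

for _ in range(200):
    gw, gb = [lam * w[0], lam * w[1]], 0.0   # regularization term
    for (x1, x2), yi in zip(X, y):
        if yi * (w[0] * x1 + w[1] * x2 + b) < 1.0:
            # Only margin-violating patterns enter the subgradient.
            gw[0] -= yi * x1 / n
            gw[1] -= yi * x2 / n
            gb -= yi / n
    w = [w[0] - lr * gw[0], w[1] - lr * gw[1]]
    b -= lr * gb

correct = sum(1 for (x1, x2), yi in zip(X, y)
              if yi * (w[0] * x1 + w[1] * x2 + b) > 0)
```

On separable data like this, the trained hyperplane classifies every pattern correctly; the inner `if` shows concretely why patterns far inside the margin can be dropped without changing the update.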
Adaptive Constraint Reduction for Convex Quadratic Programming and Training Support Vector Machines
, 2008
Abstract

Cited by 4 (2 self)
Convex quadratic programming (CQP) is an optimization problem of minimizing a convex quadratic objective function subject to linear constraints. We propose an adaptive constraint reduction primal-dual interior-point algorithm for convex quadratic programming with many more constraints than variables. We reduce the computational effort by assembling the normal equation matrix with a subset of the constraints. Instead of the exact matrix, we compute an approximate matrix for a well-chosen index set which includes indices of constraints that seem to be most critical. Starting with a large portion of the constraints, our proposed scheme excludes more unnecessary constraints at later iterations. We provide proofs for the global convergence and the quadratic local convergence rate of an affine scaling variant. A similar approach can be applied to Mehrotra’s predictor-corrector type algorithms. An example of CQP arises in training a linear support vector machine (SVM), which is a popular tool for pattern recognition. The difficulty in training a support vector machine (SVM) lies in the typically vast number of patterns used for the training process. In this work, we propose an adaptive constraint reduction primal-dual interior-point method for training the linear SVM with ℓ1 hinge loss. We reduce the computational effort by assembling the normal equation matrix with a subset of well-chosen patterns. Starting with a large portion of the patterns, our proposed scheme excludes more and more unnecessary patterns as the iteration proceeds. We extend our approach to training nonlinear SVMs through Gram matrix approximation methods. Promising numerical results are reported.