Interior-point Methods
, 2000
Abstract

Cited by 566 (15 self)
The modern era of interior-point methods dates to 1984, when Karmarkar proposed his algorithm for linear programming. In the years since then, algorithms and software for linear programming have become quite sophisticated, while extensions to more general classes of problems, such as convex quadratic programming, semidefinite programming, and nonconvex and nonlinear problems, have reached varying levels of maturity. We review some of the key developments in the area, including comments on both the complexity theory and practical algorithms for linear programming, semidefinite programming, monotone linear complementarity, and convex programming over sets that can be characterized by self-concordant barrier functions.
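The log-barrier scheme underlying these methods can be sketched in a few lines. The following is a minimal illustrative implementation (our own names and simplifications, not Karmarkar's algorithm or any production solver): it minimizes c'x subject to Ax <= b by taking damped Newton steps on t*c'x - sum(log(b_i - a_i'x)) and repeatedly increasing t.

```python
import numpy as np

def barrier_lp(c, A, b, x0, mu=10.0, tol=1e-6):
    """Minimize c@x subject to A@x <= b via a log-barrier interior-point
    sketch. x0 must be strictly feasible (A@x0 < b)."""
    x, t, m = x0.astype(float), 1.0, len(b)
    while m / t >= tol:                        # m/t bounds the duality gap
        for _ in range(100):                   # Newton's method on the barrier
            s = b - A @ x                      # slacks, kept strictly positive
            g = t * c + A.T @ (1.0 / s)        # gradient of t*c@x - sum(log s)
            H = A.T @ ((1.0 / s**2)[:, None] * A)  # Hessian of the barrier
            dx = np.linalg.solve(H, -g)
            if -g @ dx < tol:                  # Newton decrement small: done
                break
            step = 1.0                         # backtrack to stay feasible
            while np.any(b - A @ (x + step * dx) <= 0):
                step *= 0.5
            x = x + step * dx
        t *= mu                                # sharpen the barrier
    return x
```

On the toy LP min x1 + x2 subject to x1, x2 >= 0 and x1 + x2 >= 1, the iterates follow the central path toward the analytic center (0.5, 0.5) of the optimal face, illustrating the near-dimension-independent iteration counts the survey discusses.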
Linear Programming, Complexity Theory and Elementary Functional Analysis
 Mathematical Programming
, 1995
Abstract

Cited by 101 (1 self)
This paper was conceived in part while the author was sponsored by the visiting scientist program at the IBM T.J. Watson Research Center. Special thanks to Mike Shub, Roy Adler and Shmuel Winograd for their generosity.
Two Numerical Methods for Optimizing Matrix Stability
 Linear Algebra Appl
, 2001
Abstract

Cited by 31 (8 self)
Consider the affine matrix family A(x) = A_0 + Σ_{k=1}^m x_k A_k, mapping a design vector x ∈ ℝ^m into the space of n × n real matrices.
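To make the setup concrete, here is a small NumPy sketch (illustrative names of our own, not code from the paper) that evaluates such an affine matrix family and the spectral abscissa, the maximum real part of the eigenvalues of A(x), which is the usual stability measure optimized in this setting:

```python
import numpy as np

def affine_family(mats, x):
    """Evaluate A(x) = A_0 + sum_{k=1}^m x_k * A_k, with mats = [A_0, ..., A_m]."""
    return mats[0] + sum(xk * Ak for xk, Ak in zip(x, mats[1:]))

def spectral_abscissa(A):
    """Max real part of the eigenvalues; A(x) is (Hurwitz) stable when < 0."""
    return float(np.max(np.linalg.eigvals(A).real))

# Example: a single design parameter shifting a 2x2 system toward stability.
A0 = np.array([[0.0, 1.0], [0.0, 0.0]])
A1 = -np.eye(2)
alpha = spectral_abscissa(affine_family([A0, A1], [2.0]))  # eigenvalues -2, -2
```

Minimizing `spectral_abscissa(affine_family(mats, x))` over x is the nonsmooth, nonconvex problem the two numerical methods of the paper address.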
TOMLAB - An Environment for Solving Optimization Problems in MATLAB
 PROCEEDINGS FOR THE NORDIC MATLAB CONFERENCE '97
, 1997
Abstract

Cited by 16 (12 self)
TOMLAB is a general purpose, open and integrated MATLAB environment for solving optimization problems on UNIX and PC systems. TOMLAB has menu systems and driver routines for the most common optimization problems and more than 50 algorithms implemented in the toolboxes NLPLIB and OPERA. NLPLIB TB 1.0 is a MATLAB toolbox for nonlinear programming and parameter estimation, and OPERA TB 1.0 is a MATLAB toolbox for operational research, with emphasis on linear and discrete optimization. Of special interest in NLPLIB TB 1.0 are the algorithms for general and separable nonlinear least squares parameter estimation. TOMLAB uses MEX-file interfaces to call solvers written in C/C++ and FORTRAN. Currently, MEX-file interfaces have been developed for the commercial solvers MINOS, NPSOL, NPOPT, NLSSOL, LPOPT, QPOPT and LSSOL. From TOMLAB it is also possible to call routines in the MathWorks Optimization Toolbox. Interfaces are available for the model language AMPL and the CUTE (Cons...
TOMLAB - A General Purpose, Open MATLAB Environment for Research and Teaching in Optimization
, 1998
Abstract

Cited by 14 (13 self)
TOMLAB is a general purpose, open and integrated MATLAB environment for research and teaching in optimization on UNIX and PC systems. The motivation for TOMLAB is to simplify research on practical optimization problems, giving easy access to all types of solvers while retaining full access to the power of MATLAB. By using a simple but general input format, combined with MATLAB's ability to evaluate string expressions, it is possible to run internal TOMLAB solvers, the MATLAB Optimization Toolbox, and commercial solvers written in FORTRAN or C/C++ using MEX-file interfaces. Currently, MEX-file interfaces have been developed for MINOS, NPSOL, NPOPT, NLSSOL, LPOPT, QPOPT and LSSOL. TOMLAB can be used either fully parameter-driven or menu-driven. The basic principles will be discussed. The menu system makes it suitable for teaching. Many standard test problems are included, and more are easily added. There are many example and demonstration files. Iterati...
More complete gene silencing by fewer siRNAs: transparent optimized design and biophysical signature
, 2006
Strong Duality and Minimal Representations for Cone Optimization
, 2008
Abstract

Cited by 9 (2 self)
The elegant results for strong duality and strict complementarity for linear programming, LP, can fail for cone programming over nonpolyhedral cones. One can have: unattained optimal values; nonzero duality gaps; and no primal-dual optimal pair that satisfies strict complementarity. This failure is tied to the nonclosure of sums of nonpolyhedral closed cones. We take a fresh look at known and new results for duality, optimality, constraint qualifications, and strict complementarity for linear cone optimization problems in finite dimensions. These results include: weakest and universal constraint qualifications, CQs; duality and characterizations of optimality that hold without any CQ; the geometry of nice and devious cones; the geometric relationships between zero duality gaps, strict complementarity, and the facial structure of cones; and the connection between theory and empirical evidence for lack of a CQ and failure of strict complementarity. One theme is the notion of a minimal representation of the cone and the constraints in order to regularize the problem and avoid both the theoretical and numerical difficulties that arise due to (near) loss of a CQ. We include a discussion on obtaining these representations efficiently.
Interior point methods 25 years later
 European Journal of Operational Research
Abstract

Cited by 9 (3 self)
Interior point methods for optimization have been around for more than 25 years now. Their presence has shaken up the field of optimization. Interior point methods for linear and (convex) quadratic programming display several features which make them particularly attractive for very large scale optimization. Among the most impressive of them are their low-degree polynomial worst-case complexity and an unrivalled ability to deliver optimal solutions in an almost constant number of iterations which depends very little, if at all, on the problem dimension. Interior point methods are competitive when dealing with small problems of dimensions below one million constraints and variables and are beyond competition when applied to large problems of dimensions going into millions of constraints and variables. In this survey we will discuss several issues related to interior point methods including the proof of the worst-case complexity result, the reasons for their amazingly fast practical convergence and the features responsible for their ability to solve very large problems. The ever-growing sizes of optimization problems impose new requirements on optimization methods and software. In the final part of this paper we will therefore address a redesign of interior point methods to allow them to work in a matrix-free regime and to make them well-suited to solving even larger problems.
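The matrix-free regime mentioned above rests on solving the normal equations of each interior-point step, (A D A')Δy = r, using only matrix-vector products with A, e.g. by conjugate gradients. A minimal sketch under that assumption (the scaling vector d below is a stand-in for the IPM scaling, and all names are ours, not any solver's actual code):

```python
import numpy as np

def cg(matvec, rhs, tol=1e-10, maxiter=500):
    """Conjugate gradients for S y = rhs, given only the map y -> S @ y."""
    y = np.zeros_like(rhs)
    r = rhs - matvec(y)                       # initial residual
    p, rs = r.copy(), r @ r
    for _ in range(maxiter):
        Sp = matvec(p)
        alpha = rs / (p @ Sp)
        y += alpha * p
        r -= alpha * Sp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p, rs = r + (rs_new / rs) * p, rs_new
    return y

# Normal-equations operator S = A D A^T applied without ever forming S:
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 12))              # constraint matrix
d = np.linspace(1.0, 2.0, 12)                 # positive IPM-style scaling
matvec = lambda y: A @ (d * (A.T @ y))        # two matvecs and one scaling
y = cg(matvec, np.ones(5))
```

Since only `matvec` is needed, A can itself be an implicit operator, which is what lets such methods scale to problems where A D A' is too large or too dense to form.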
Large-Scale Nonlinear Constrained Optimization: A Current Survey
, 1994
Abstract

Cited by 9 (0 self)
Much progress has been made in constrained nonlinear optimization in the past ten years, but most large-scale problems still represent a considerable obstacle. In this survey paper we will attempt to give an overview of the current approaches, including interior and exterior methods and algorithms based upon trust regions and line searches. In addition, the importance of software, numerical linear algebra and testing will be addressed. We will try to explain why the difficulties arise, how attempts are being made to overcome them and some of the problems that still remain. Although there will be some emphasis on the LANCELOT and CUTE projects, the intention is to give a broad picture of the state-of-the-art.
1 IBM T.J. Watson Research Center, P.O. Box 218, Yorktown Heights, NY 10598, USA; 2 Parallel Algorithms Team, CERFACS, 42 Ave. G. Coriolis, 31057 Toulouse Cedex, France; 3 Central Computing Department, Rutherford Appleton Laboratory, Chilton, Oxfordshire, OX11 0QX, England ...