Results 1–10 of 72
Lectures on modern convex optimization
Society for Industrial and Applied Mathematics (SIAM), 2001
Abstract
Cited by 96 (7 self)
Mathematical Programming deals with optimization programs of the form minimize f(x) subject to g_i(x) ≤ 0, i = 1, ..., m (x ∈ R^n), and includes the following general areas: 1. Modelling: methodologies for posing various applied problems as optimization programs; 2. Optimization Theory, focusing on existence, uniqueness, and characterization of optimal solutions to optimization programs; 3. Optimization Methods: development and analysis of computational algorithms for various classes of optimization programs; 4. Implementation, testing, and application of modelling methodologies and computational algorithms. Essentially, Mathematical Programming was born in 1948, when George Dantzig invented Linear Programming, the class of optimization programs (P) with linear objective f(·) and ...
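The program form in this abstract specializes, in Dantzig's Linear Programming, to a linear objective and linear inequality constraints. As a minimal sketch (not from the cited lecture notes; the problem data are made up for illustration), a tiny two-variable LP of the form minimize c^T x subject to Ax ≤ b can be solved by brute-force vertex enumeration, since a finite LP optimum is attained at a vertex of the feasible polyhedron:

```python
import numpy as np

# Tiny LP in the abstract's form: minimize f(x) = c^T x subject to g_i(x) = A_i x - b_i <= 0.
# In two variables every vertex is the intersection of two constraint boundaries,
# so enumerating constraint pairs suffices for this demo.
c = np.array([-1.0, -2.0])            # i.e. maximize x1 + 2*x2
A = np.array([[1.0, 1.0],             # x1 + x2 <= 4
              [1.0, 0.0],             # x1 <= 3
              [-1.0, 0.0],            # x1 >= 0
              [0.0, -1.0]])           # x2 >= 0
b = np.array([4.0, 3.0, 0.0, 0.0])

def solve_lp_2d(c, A, b, tol=1e-9):
    """Enumerate vertices of {x : A x <= b} and return the one minimizing c^T x."""
    best_x, best_val = None, np.inf
    m = len(b)
    for i in range(m):
        for j in range(i + 1, m):
            M = A[[i, j]]
            if abs(np.linalg.det(M)) < tol:
                continue                    # parallel boundaries: no vertex
            x = np.linalg.solve(M, b[[i, j]])
            if np.all(A @ x <= b + tol):    # keep only feasible intersections
                val = c @ x
                if val < best_val:
                    best_x, best_val = x, val
    return best_x, best_val

x_opt, f_opt = solve_lp_2d(c, A, b)
print(x_opt, f_opt)   # optimum at x = (0, 4) with objective -8
```

Vertex enumeration is exponential in general; the simplex and interior point methods discussed throughout these results exist precisely to avoid it.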
Convex Nondifferentiable Optimization: A Survey Focussed On The Analytic Center Cutting Plane Method.
, 1999
Abstract
Cited by 60 (2 self)
We present a survey of nondifferentiable optimization problems and methods, with special focus on the analytic center cutting plane method. We propose a self-contained convergence analysis that uses the formalism of the theory of self-concordant functions, but for the main results we give direct proofs based on the properties of the logarithmic function. We also provide an in-depth analysis of two extensions that are very relevant to practical problems: the case of multiple cuts and the case of deep cuts. We further examine extensions to problems whose feasible sets are partially described by an explicit barrier function, and to the case of nonlinear cuts. Finally, we review several implementation issues and discuss some applications.
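For intuition about the cutting-plane schemes this survey covers, here is a sketch of the simplest relative, Kelley's cutting-plane method, on a one-dimensional nondifferentiable convex function. The analytic center method differs in querying the analytic center of the localization set rather than the minimizer of the piecewise-linear model; the function |x| and the grid-based model minimization below are illustrative assumptions, not the survey's algorithm.

```python
import numpy as np

# f(x) = |x| is convex and nondifferentiable at its minimizer x* = 0.
def f(x):
    return abs(x)

def subgrad(x):
    return 1.0 if x >= 0 else -1.0   # a valid subgradient of |x|

lo, hi = -2.0, 2.0       # box known to contain the minimizer
x_k = 2.0                # initial query point
cuts = []                # (x_k, f(x_k), g_k): cut is f_k + g_k * (x - x_k) <= f(x)
for _ in range(30):
    cuts.append((x_k, f(x_k), subgrad(x_k)))
    # Minimize the piecewise-linear lower model max_k [f_k + g_k (x - x_k)]
    # over a fine grid (a crude stand-in for an exact LP/center computation).
    grid = np.linspace(lo, hi, 4001)
    model = np.max([fk + gk * (grid - xk) for xk, fk, gk in cuts], axis=0)
    x_k = grid[np.argmin(model)]

print(round(x_k, 3))   # ≈ 0.0, the minimizer of |x|
```

With just two cuts the model of |x| is already exact here; on harder problems plain Kelley steps can oscillate, which is one motivation for the center-based query points the survey analyzes.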
Multiple Cuts in the Analytic Center Cutting Plane Method
, 1998
Abstract
Cited by 31 (1 self)
We analyze the multiple cut generation scheme in the analytic center cutting plane method. We propose an optimal primal and dual updating direction when the cuts are central. The direction is optimal in the sense that it maximizes the product of the new dual slacks and of the new primal variables within the trust regions defined by Dikin's primal and dual ellipsoids. The new primal and dual directions use the variance-covariance matrix of the normals to the new cuts in the metric given by Dikin's ellipsoid. We prove that the recovery of a new analytic center from the optimal restoration direction can be done in O(p log(p + 1)) damped Newton steps, where p is the number of new cuts added by the oracle, which may vary with the iteration. The results and the proofs are independent of the specific scaling matrix (primal, dual, or primal-dual) that is used in the computations. The computation of the optimal direction uses Newton's method applied to a self-concordant function of p variables ...
Sensitivity analysis in linear programming and semidefinite programming using interior-point methods
 Cornell University
, 1999
Abstract
Cited by 13 (2 self)
We analyze perturbations of the right-hand side and the cost parameters in linear programming (LP) and semidefinite programming (SDP). We obtain tight bounds on the norm of the perturbations that allow interior-point methods to recover feasible and near-optimal solutions in a single interior-point iteration. For the unique, nondegenerate solution case in LP, we show that the bounds obtained using interior-point methods compare nicely with the bounds arising from the simplex method. We also present explicit bounds for SDP using the AHO, H..K..M, and NT directions.
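A minimal numerical illustration of right-hand-side sensitivity (a textbook shadow-price fact, not this paper's interior-point bounds): for the LP minimize c^T x subject to x1 + x2 = b, x ≥ 0, the optimal dual multiplier of the constraint predicts the change in optimal value under a small perturbation of b. The problem data below are assumptions for the demo.

```python
import numpy as np

# LP: min c^T x  s.t.  x1 + x2 = b, x >= 0.
# The optimum puts the whole budget on the cheaper variable, so f*(b) = min(c) * b,
# and the dual multiplier (shadow price) y* = min(c) gives
#   f*(b + db) - f*(b) = y* * db
# for perturbations small enough not to change the optimal basis.
c = np.array([1.0, 2.0])

def solve(b):
    i = np.argmin(c)                  # cheaper variable carries the whole budget
    x = np.zeros(2)
    x[i] = b
    return x, c @ x

y_star = c.min()                      # optimal dual multiplier of the constraint
_, f0 = solve(3.0)
_, f1 = solve(3.0 + 0.1)
print(f1 - f0, y_star * 0.1)          # both ≈ 0.1: the dual predicts the change
```

The paper's contribution is quantitative versions of this picture: how large a perturbation can be while one interior-point iteration still restores feasibility and near-optimality.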
New Complexity Analysis of the Primal-Dual Newton Method for Linear Optimization
, 1998
Abstract
Cited by 13 (8 self)
We deal with the primal-dual Newton method for linear optimization (LO). Nowadays, this method is the workhorse in all efficient interior point algorithms for LO, and its analysis is the basic element in all polynomiality proofs of such algorithms. At present there is still a gap between the practical behavior of the algorithms and the theoretical performance results, in favor of the practical behavior. This is especially true for so-called large-update methods. We present some new analysis tools, based on a proximity measure introduced by Jansen et al. in 1994, that may help to close this gap. This proximity measure has not been used in the analysis of large-update methods before. Our new analysis not only provides a unified way to analyze both large-update and small-update methods, but also improves the known iteration bounds.
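As a toy illustration of the primal-dual Newton step this line of work analyzes (a generic sketch, not the paper's large-update variant), the following applies damped Newton steps to the perturbed KKT conditions Ax = b, A^T y + s = c, x_i s_i = μ of a tiny LP while μ is driven to zero. The problem data, the μ-halving schedule, and the 0.99 fraction-to-boundary damping are assumptions made for the demo.

```python
import numpy as np

# Tiny LP: minimize c^T x  s.t.  A x = b, x >= 0.
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([3.0])
c = np.array([1.0, 2.0, 3.0])

# Strictly feasible start: x = e, y = 0, s = c > 0.
x, y, s = np.ones(3), np.zeros(1), c.copy()
n, m = 3, 1
mu = (x @ s) / n

def max_step(v, dv):
    """Largest alpha in (0, 1] keeping v + alpha*dv strictly positive (0.99 safety)."""
    mask = dv < 0
    if not mask.any():
        return 1.0
    return min(1.0, 0.99 * np.min(-v[mask] / dv[mask]))

for _ in range(30):
    mu *= 0.5                           # shrink the central-path parameter
    # Newton system for F(x,y,s) = (Ax - b, A^T y + s - c, XSe - mu*e) = 0
    J = np.zeros((2 * n + m, 2 * n + m))
    J[:m, :n] = A                       # primal feasibility rows
    J[m:m + n, n:n + m] = A.T           # dual feasibility rows
    J[m:m + n, n + m:] = np.eye(n)
    J[m + n:, :n] = np.diag(s)          # complementarity rows: S dx + X ds
    J[m + n:, n + m:] = np.diag(x)
    rhs = -np.concatenate([A @ x - b, A.T @ y + s - c, x * s - mu])
    d = np.linalg.solve(J, rhs)
    dx, dy, ds = d[:n], d[n:n + m], d[n + m:]
    alpha = min(max_step(x, dx), max_step(s, ds))   # damped Newton step
    x, y, s = x + alpha * dx, y + alpha * dy, s + alpha * ds

print(np.round(x, 4))   # ≈ [3, 0, 0]: all weight on the cheapest variable
```

Halving μ at every step is an aggressive ("large-update") schedule; the gap the abstract describes is that such schedules work well in practice while the classical theory only certifies the much more timid small updates.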
A New Class of Polynomial Primal-Dual Methods for Linear and Semidefinite Optimization
, 1999
Abstract
Cited by 12 (7 self)
We propose a new class of primal-dual methods for linear optimization (LO). Using some new analysis tools, we prove that the large-update method for LO based on the new search direction has a polynomial complexity of O(n^{4/(4+ρ)} log(n/ε)) iterations, where ρ ∈ [0, 2] is a parameter used in the system defining the search direction. If ρ = 0, our results reproduce the well-known complexity of the standard primal-dual Newton method for LO. At each iteration, our algorithm needs only to solve a linear equation system. An extension of the algorithms to semidefinite optimization is also presented. Keywords: Linear Optimization, Semidefinite Optimization, Interior Point Method, Primal-Dual Newton Method, Polynomial Complexity. AMS Subject Classification: 90C05. 1 Introduction. Interior point methods (IPMs) are among the most effective methods for solving wide classes of optimization problems. Since the seminal work of Karmarkar [7], many researchers have proposed and analyzed various ...
How good are interior point methods? Klee–Minty cubes tighten iteration-complexity bounds
, 2004
Self-regular proximities and new search directions for linear and semidefinite optimization
 Mathematical Programming
, 2000
Abstract
Cited by 11 (5 self)
In this paper, we first introduce the notion of self-regular functions. Various appealing properties of self-regular functions are explored, and we also discuss the relation between self-regular functions and the well-known self-concordant functions. Then we use such functions to define self-regular proximity measures for path-following interior point methods for solving linear optimization (LO) problems. Any self-regular proximity measure naturally defines a primal-dual search direction. In this way a new class of primal-dual search directions for solving LO problems is obtained. Using the appealing properties of self-regular functions, we prove that these new large-update path-following methods for LO enjoy a polynomial, O(n^{(q+1)/(2q)} log(n/ε)), iteration bound, where q ≥ 1 is the so-called barrier degree of the self-regular proximity measure underlying the algorithm. As q increases, this bound approaches the best known complexity bound for interior point methods, namely O(√n log(n/ε)). Our unified analysis also provides the best known iteration bound of small-update IPMs, O(√n log(n/ε)). At each iteration, we need only to solve one linear system. As a byproduct of our results, we remove some limitations of the algorithms presented in [24] and improve their complexity as well. An extension of these results to semidefinite optimization (SDO) is also discussed.
Q-superlinear convergence of the iterates in primal-dual interior-point methods
 MATH. PROGRAM., SER. A 91: 99–115 (2001)
, 2001