Results 1–10 of 35
Convex Nondifferentiable Optimization: A Survey Focussed On The Analytic Center Cutting Plane Method.
, 1999
"... We present a survey of nondifferentiable optimization problems and methods with special focus on the analytic center cutting plane method. We propose a selfcontained convergence analysis, that uses the formalism of the theory of selfconcordant functions, but for the main results, we give direct pr ..."
Abstract

Cited by 53 (2 self)
 Add to MetaCart
We present a survey of nondifferentiable optimization problems and methods, with special focus on the analytic center cutting plane method. We propose a self-contained convergence analysis that uses the formalism of the theory of self-concordant functions, but for the main results we give direct proofs based on the properties of the logarithmic function. We also provide an in-depth analysis of two extensions that are very relevant to practical problems: the case of multiple cuts and the case of deep cuts. We further examine extensions to problems whose feasible sets are partially described by an explicit barrier function, and to the case of nonlinear cuts. Finally, we review several implementation issues and discuss some applications.
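The analytic center at the heart of the method surveyed above can be illustrated concretely: for a polytope {x : Ax <= b}, it is the maximizer of the logarithmic barrier sum_i log(b_i - a_i^T x). The sketch below is illustrative only (not the paper's algorithm; the data is a hypothetical box constraint) and computes the center with a damped Newton method:

```python
import numpy as np

def analytic_center(A, b, x0, iters=50):
    """Damped Newton method to maximize sum(log(b - A @ x)), whose
    maximizer is the analytic center of {x : A x <= b}.
    x0 must be strictly feasible."""
    x = x0.astype(float)
    for _ in range(iters):
        s = b - A @ x                        # slacks, must stay positive
        g = A.T @ (1.0 / s)                  # gradient of the negated barrier
        H = A.T @ np.diag(1.0 / s**2) @ A    # Hessian of the negated barrier
        dx = np.linalg.solve(H, -g)          # Newton direction
        t = 1.0                              # damp the step to stay feasible
        while np.any(b - A @ (x + t * dx) <= 0):
            t *= 0.5
        x = x + t * dx
    return x

# Hypothetical example: for the box [-1, 1]^2 the analytic center is the origin.
A = np.vstack([np.eye(2), -np.eye(2)])
b = np.ones(4)
c = analytic_center(A, b, np.array([0.3, -0.2]))
```

In ACCPM each new cut adds a row to A, and the center is recomputed from the previous one by a few Newton steps.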
Variable Metric Bundle Methods: from Conceptual to Implementable Forms
, 1996
"... To minimize a convex function, we combine MoreauYosida regularizations, quasiNewton matrices and bundling mechanisms. First we develop conceptual forms using "reversal " quasiNewton formulae and we state their global and local convergence. Then, to produce implementable versions, ..."
Abstract

Cited by 41 (8 self)
 Add to MetaCart
To minimize a convex function, we combine Moreau-Yosida regularizations, quasi-Newton matrices and bundling mechanisms. First we develop conceptual forms using "reversal" quasi-Newton formulae and state their global and local convergence. Then, to produce implementable versions, we incorporate a bundle strategy together with a "curve-search". No convergence results are given for the implementable versions; however, some numerical illustrations show their good behaviour even for large-scale problems.
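The Moreau-Yosida regularization F(x) = min_y f(y) + (y - x)^2/(2*lam) is the smooth function whose curvature the quasi-Newton updates approximate. A minimal sketch for f(x) = |x|, whose proximal operator is soft-thresholding (illustrative only, not the paper's method):

```python
def prox_abs(x, lam):
    """Proximal operator of f(y) = |y|: soft-thresholding."""
    if abs(x) <= lam:
        return 0.0
    return (abs(x) - lam) * (1.0 if x > 0 else -1.0)

def moreau_yosida(x, lam):
    """Moreau-Yosida regularization F(x) = min_y |y| + (y - x)**2 / (2*lam).
    For f = |.| this is exactly the smooth Huber function."""
    p = prox_abs(x, lam)
    return abs(p) + (p - x) ** 2 / (2 * lam)

# The proximal point iteration x <- prox_{lam*f}(x) minimizes f:
x = 3.0
for _ in range(10):
    x = prox_abs(x, 0.5)
```

The conceptual algorithms in the paper accelerate exactly this iteration by a variable metric.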
Optimal Power Generation under Uncertainty via Stochastic Programming
 in: Stochastic Programming Methods and Technical Applications (K. Marti and P. Kall Eds.), Lecture Notes in Economics and Mathematical Systems
, 1997
"... : A power generation system comprising thermal and pumpedstorage hydro plants is considered. Two kinds of models for the costoptimal generation of electric power under uncertain load are introduced: (i) a dynamic model for the shortterm operation and (ii) a power production planning model. In both ..."
Abstract

Cited by 23 (8 self)
 Add to MetaCart
A power generation system comprising thermal and pumped-storage hydro plants is considered. Two kinds of models for the cost-optimal generation of electric power under uncertain load are introduced: (i) a dynamic model for short-term operation and (ii) a power production planning model. The presence of stochastic data in the optimization models leads to a multistage and a two-stage stochastic program, respectively. Both stochastic programming problems involve a large number of mixed-integer (stochastic) decisions, but their constraints are only loosely coupled across operating power units. This is exploited to design Lagrangian relaxation methods for both models, which lead to a decomposition into stochastic single-unit subproblems. For the dynamic model, a Lagrangian decomposition based algorithm is described in more detail. Special emphasis is put on a discussion of the duality gap, the efficient solution of the multistage single-unit subproblems, and on solving the dual problem...
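The decomposition idea can be seen in a toy deterministic version: two units with quadratic costs must jointly cover a demand, and relaxing the coupling constraint with a multiplier makes the single-unit subproblems independent. All data below is hypothetical, and plain gradient ascent on the dual suffices only because this toy dual is smooth; the paper's mixed-integer setting needs genuinely nonsmooth dual methods:

```python
def lagrangian_dual(d=10.0, a=(1.0, 2.0), steps=50):
    """Toy Lagrangian relaxation: minimize a[0]*x1**2 + a[1]*x2**2
    subject to the coupling constraint x1 + x2 >= d.  Relaxing it with
    a multiplier lam >= 0 decouples the problem into single-unit
    subproblems min_x a_i*x**2 - lam*x, solved in closed form."""
    lam = 0.0
    for _ in range(steps):
        # Independent single-unit subproblems: x_i = lam / (2 * a_i).
        x = [lam / (2.0 * ai) for ai in a]
        # Ascend the dual; its gradient is the constraint violation.
        lam = max(0.0, lam + 1.0 * (d - sum(x)))
    x = [lam / (2.0 * ai) for ai in a]
    return lam, x

lam, x = lagrangian_dual()
```

Here the dual converges to lam = 40/3 with unit outputs (20/3, 10/3), exactly covering the demand; with integer on/off decisions a duality gap appears, which is the issue the paper discusses.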
A Logarithmic Barrier Cutting Plane Method for Convex Programming
 Annals of Operations Research
, 1993
"... The paper presents a logarithmic barrier cutting plane algorithm for convex (possibly nonsmooth, semiinfinite) programming. Most cutting plane methods, like that of Kelley, and Cheney and Goldstein solve a linear approximation (localization) of the problem, and then generate an additional cut to r ..."
Abstract

Cited by 21 (2 self)
 Add to MetaCart
The paper presents a logarithmic barrier cutting plane algorithm for convex (possibly nonsmooth, semi-infinite) programming. Most cutting plane methods, like those of Kelley and of Cheney and Goldstein, solve a linear approximation (localization) of the problem and then generate an additional cut to remove the linear program's optimal point. Other methods, like the "central cutting plane" methods of Elzinga-Moore and Goffin-Vial, calculate a center of the linear approximation and then adjust the level of the objective, or separate the current center from the feasible set. Contrary to these existing techniques, we develop a method which does not solve the linear relaxations to optimality, but rather stays in the interior of the feasible set. The iterates follow the central path of a linear relaxation until the current iterate either leaves the feasible set or is too close to the boundary. When this occurs, a new cut is generated and the algorithm iterates. We use the tools developed b...
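For contrast with the interior approach described above, the classical Kelley master problem is easy to state in one dimension, where the piecewise-linear model can be minimized exactly by inspecting interval endpoints and cut intersections. A hedged sketch (illustrative names, not the paper's algorithm):

```python
def kelley(f, grad, x0, lo, hi, iters=60):
    """Kelley's cutting-plane method for a 1-D convex f on [lo, hi]:
    minimize the piecewise-linear lower model, then cut at its minimizer."""
    cuts = []                            # each cut means: model(y) >= g*y + c
    x = x0
    for _ in range(iters):
        g = grad(x)
        cuts.append((g, f(x) - g * x))   # tangent cut at the current point
        # In 1-D the model's minimizer is an endpoint of [lo, hi] or an
        # intersection of two cuts, so enumerating candidates is exact.
        cand = [lo, hi]
        for g1, c1 in cuts:
            for g2, c2 in cuts:
                if g1 != g2:
                    y = (c2 - c1) / (g1 - g2)
                    if lo <= y <= hi:
                        cand.append(y)
        x = min(cand, key=lambda y: max(gk * y + ck for gk, ck in cuts))
    return x

# Minimize f(x) = x**2 on [-2, 2]; the minimizer is 0.
x_star = kelley(lambda x: x * x, lambda x: 2 * x, x0=1.5, lo=-2.0, hi=2.0)
```

In higher dimensions the master problem is a linear program; the paper's method avoids solving it to optimality at every step.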
Global Optimization of Nonconvex Nonlinear Programs Using Parallel Branch and Bound
, 1995
"... A branch and bound algorithm for computing globally optimal solutions to nonconvex nonlinear programs in continuous variables is presented. The algorithm is directly suitable for a wide class of problems arising in chemical engineering design. It can solve problems defined using algebraic functions ..."
Abstract

Cited by 9 (0 self)
 Add to MetaCart
A branch and bound algorithm for computing globally optimal solutions to nonconvex nonlinear programs in continuous variables is presented. The algorithm is directly suitable for a wide class of problems arising in chemical engineering design. It can solve problems defined using algebraic functions and twice differentiable transcendental functions, in which finite upper and lower bounds can be placed on each variable. The algorithm uses rectangular partitions of the variable domain and a new bounding program based on convex/concave envelopes and positive definite combinations of quadratic terms. The algorithm is deterministic and obtains convergence with final regions of finite size. The partitioning strategy uses a sensitivity analysis of the bounding program to predict the best variable to split and the split location. Two versions of the algorithm are considered, the first using a local NLP algorithm (MINOS) and the second using a sequence of lower bounding programs in the search fo...
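A drastically simplified one-dimensional version of rectangular branch and bound conveys the mechanics: bound each box with naive interval arithmetic instead of the paper's convex/concave envelopes, branch by bisection, and prune against the incumbent. The test function and all names are illustrative:

```python
import heapq

def bnb_minimize(lo, hi, tol=1e-4):
    """Branch-and-bound global minimization of the nonconvex function
    f(x) = x**4 - x**2 on [lo, hi].  Each box [a, b] gets the valid
    interval-arithmetic lower bound min(x^4 on box) - max(x^2 on box);
    this is a sketch, not the paper's envelope-based bounding program."""
    f = lambda x: x**4 - x**2

    def lower_bound(a, b):
        min4 = 0.0 if a <= 0.0 <= b else min(a**4, b**4)
        return min4 - max(a * a, b * b)

    best = min(f(lo), f(hi), f(0.5 * (lo + hi)))     # incumbent upper bound
    heap = [(lower_bound(lo, hi), lo, hi)]           # best-first search
    while heap:
        lb, a, b = heapq.heappop(heap)
        if lb > best - tol:
            continue                    # box cannot improve the incumbent
        m = 0.5 * (a + b)
        best = min(best, f(m))          # refine the incumbent
        for u, v in ((a, m), (m, b)):   # bisect and keep promising halves
            l = lower_bound(u, v)
            if l < best - tol:
                heapq.heappush(heap, (l, u, v))
    return best

val = bnb_minimize(-2.0, 2.0)   # global minimum is -0.25 at x = +-1/sqrt(2)
```

On termination the incumbent is within `tol` of the global minimum, mirroring the finite-size final regions mentioned in the abstract.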
Logarithmic Barrier Decomposition Methods for Semi-Infinite Programming
, 1996
"... A computational study of some logarithmic barrier decomposition algorithms for semiinfinite programming is presented in this paper. The conceptual algorithm is a straightforward adaptation of the logarithmic barrier cutting plane algorithm which was presented recently by den Hertog et al., to solv ..."
Abstract

Cited by 9 (1 self)
 Add to MetaCart
A computational study of some logarithmic barrier decomposition algorithms for semi-infinite programming is presented in this paper. The conceptual algorithm is a straightforward adaptation of the logarithmic barrier cutting plane algorithm presented recently by den Hertog et al. to solve semi-infinite programming problems. Decomposition (cutting plane) methods usually use cutting planes to improve the localization of the given problem. In this paper we propose an extension which uses linear cuts to solve large-scale, difficult real-world problems. This algorithm uses both static and (doubly) dynamic enumeration of the parameter space and allows multiple cuts to be added simultaneously for larger/difficult problems. The algorithm is implemented on both sequential and parallel computers. Implementation issues and parallelization strategies are discussed and encouraging computational results are presented. Keywords: column generation, convex programming, cutting plane met...
A VUalgorithm for convex minimization
 Mathematical Programming
, 2005
"... For convex minimization we introduce an algorithm based on VUspace decomposition. The method uses a bundle subroutine to generate a sequence of approximate proximal points. When a primaldual track leading to a solution and zero subgradient pair exists, these points approximate the primal track poi ..."
Abstract

Cited by 8 (1 self)
 Add to MetaCart
For convex minimization we introduce an algorithm based on VU-space decomposition. The method uses a bundle subroutine to generate a sequence of approximate proximal points. When a primal-dual track leading to a solution and zero subgradient pair exists, these points approximate the primal track points and give the algorithm's V, or corrector, steps. The subroutine also approximates dual track points that are U-gradients needed for the method's U-Newton predictor steps. With the inclusion of a simple line search, the resulting algorithm is proved to be globally convergent. The convergence is superlinear if the primal-dual track points and the objective's U-Hessian are approximated well enough. Keywords: convex minimization, proximal points, bundle methods, VU-decomposition, superlinear convergence.
A Class of Variable Metric Bundle Methods
 IFIP Proceedings, Systems Modeling and Optimization
, 1993
"... : To minimize a convex function f , we state a class of penaltytype bundle algorithms, where the penalty uses a variable metric. This metric is updated according to quasiNewton formulae based on MoreauYosida approximations of f . In particular, we introduce a "reversal" quasiNewton for ..."
Abstract

Cited by 8 (2 self)
 Add to MetaCart
To minimize a convex function f, we state a class of penalty-type bundle algorithms, where the penalty uses a variable metric. This metric is updated according to quasi-Newton formulae based on Moreau-Yosida approximations of f. In particular, we introduce a "reversal" quasi-Newton formula, specially suited for our purpose. We consider several variants of the algorithm and discuss their respective merits. Furthermore, we accept a degenerate penalty term in the regularization. Keywords: bundle methods, convex optimization, mathematical programming, proximal point, quasi-Newton algorithms, variable metric.
Comparison of Bundle and Classical Column Generation
"... When a column generation approach is applied to decomposable mixed integer programming problems, it is standard to formulate and solve the master problem as a linear program. Seen in the dual space, this results in the algorithm known in the nonlinear programming community as the cuttingplane algor ..."
Abstract

Cited by 8 (0 self)
 Add to MetaCart
When a column generation approach is applied to decomposable mixed integer programming problems, it is standard to formulate and solve the master problem as a linear program. Seen in the dual space, this results in the algorithm known in the nonlinear programming community as the cutting-plane algorithm of Kelley and Cheney-Goldstein. However, more stable methods with better theoretical convergence rates are known and have been used as alternatives to this standard. One of them is the bundle method; our aim is to illustrate its differences from Kelley's method. In the process we review alternative stabilization techniques used in column generation, comparing them from both primal and dual points of view. Numerical comparisons are presented for five applications: cutting stock (which includes bin packing), vertex coloring, capacitated vehicle routing, multi-item lot sizing, and traveling salesman. We also give a brief comparison with the volume algorithm.
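The contrast drawn above can be made concrete in one dimension: a proximal bundle method replaces Kelley's linear master by the stabilized master model(x) + (x - x_hat)^2/(2t), whose minimizer stays near the stability center x_hat. A minimal sketch (illustrative, not the paper's implementation):

```python
def bundle_step(cuts, center, t):
    """Exactly minimize model(x) + (x - center)**2 / (2*t), where
    model(x) = max_k (g_k*x + c_k).  In 1-D the minimizer is either a
    kink of the model or the point center - t*g_k for the active cut,
    so enumerating those candidates is exact."""
    cand = [center - t * g for g, _ in cuts]
    for g1, c1 in cuts:
        for g2, c2 in cuts:
            if g1 != g2:
                cand.append((c2 - c1) / (g1 - g2))
    phi = lambda x: max(g * x + c for g, c in cuts) + (x - center) ** 2 / (2 * t)
    return min(cand, key=phi)

def bundle_minimize(f, grad, x0, t=1.0, iters=25):
    """Proximal bundle loop with a serious/null-step test."""
    center = x0
    cuts = [(grad(x0), f(x0) - grad(x0) * x0)]
    for _ in range(iters):
        trial = bundle_step(cuts, center, t)
        cuts.append((grad(trial), f(trial) - grad(trial) * trial))
        if f(trial) < f(center):    # serious step: move the center
            center = trial
        # otherwise: null step, the new cut just enriches the model
    return center

# Minimize the nonsmooth f(x) = |x|; the minimizer is 0.
x_min = bundle_minimize(abs, lambda x: 1.0 if x >= 0 else -1.0, x0=3.0)
```

Unlike Kelley's iterates, which can jump across the whole feasible box, each trial point here is pulled toward the best point found so far; this is the stabilization effect the paper measures.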
Strong convergence of block-iterative outer approximation methods for convex optimization
 SIAM J. Control Optim
, 1999
"... Abstract. The strong convergence of a broad class of outer approximation methods for minimizing a convex function over the intersection of an arbitrary number of convex sets in a reflexive Banach space is studied in a unified framework. The generic outer approximation algorithm under investigation p ..."
Abstract

Cited by 7 (3 self)
 Add to MetaCart
Abstract. The strong convergence of a broad class of outer approximation methods for minimizing a convex function over the intersection of an arbitrary number of convex sets in a reflexive Banach space is studied in a unified framework. The generic outer approximation algorithm under investigation proceeds by successive minimizations over the intersection of convex supersets of the feasibility set, determined in terms of the current iterate and variable blocks of constraints. The convergence analysis involves flexible constraint approximation and aggregation techniques, as well as relatively mild assumptions on the constituents of the problem. Various well-known schemes are recovered as special realizations of the generic algorithm, and parallel block-iterative extensions of these schemes are devised within the proposed framework. The case of inconsistent constraints is also considered.