Results 1-10 of 80
Convex Nondifferentiable Optimization: A Survey Focussed On The Analytic Center Cutting Plane Method.
, 1999
Abstract

Cited by 51 (2 self)
We present a survey of nondifferentiable optimization problems and methods with special focus on the analytic center cutting plane method. We propose a self-contained convergence analysis that uses the formalism of the theory of self-concordant functions, but for the main results we give direct proofs based on the properties of the logarithmic function. We also provide an in-depth analysis of two extensions that are very relevant to practical problems: the case of multiple cuts and the case of deep cuts. We further examine extensions to problems including feasible sets partially described by an explicit barrier function, and to the case of nonlinear cuts. Finally, we review several implementation issues and discuss some applications.
A Cutting Plane Method from Analytic Centers for Stochastic Programming
 Mathematical Programming
, 1994
Abstract

Cited by 49 (18 self)
The stochastic linear programming problem with recourse has a dual block angular structure. It can thus be handled by Benders decomposition or by Kelley's method of cutting planes; equivalently, the dual problem has a primal block angular structure and can be handled by Dantzig-Wolfe decomposition; the two approaches are in fact identical by duality. Here we shall investigate the use of the method of cutting planes from analytic centers applied to similar formulations. The only significant difference from the aforementioned methods is that new cutting planes (or columns, by duality) will be generated not from the optimum of the linear programming relaxation, but from the analytic center of the localization set.

1 Introduction
The study of optimization problems in the presence of uncertainty still taxes the limits of methodology and software. One of the most approachable settings is that of two-stage planning under uncertainty, in which a first-stage decision has to be taken bef...
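Kelley's method of cutting planes, mentioned in the abstract above, can be illustrated with a minimal one-dimensional sketch (a hypothetical toy example, not code from the paper): minimize f(x) = x^2 over [-2, 2] by accumulating linear underestimators and minimizing their pointwise maximum.

```python
# Toy sketch of Kelley's cutting plane method in 1-D (illustrative only).
# Each cut is the tangent line f(x_k) + f'(x_k)*(z - x_k); the relaxed
# master problem minimizes the maximum of all cuts over [lo, hi].

def f(x): return x * x
def fprime(x): return 2.0 * x

def master_argmin(cuts, lo, hi):
    """Minimize the pointwise max of affine cuts (a, b) -> a*z + b over [lo, hi].

    In 1-D the minimizer lies at an interval endpoint or at an
    intersection of two cuts, so enumerating candidates suffices.
    """
    candidates = [lo, hi]
    for i in range(len(cuts)):
        for j in range(i + 1, len(cuts)):
            (a1, b1), (a2, b2) = cuts[i], cuts[j]
            if a1 != a2:
                z = (b2 - b1) / (a1 - a2)   # solve a1*z + b1 == a2*z + b2
                if lo <= z <= hi:
                    candidates.append(z)
    return min(candidates, key=lambda z: max(a * z + b for a, b in cuts))

lo, hi = -2.0, 2.0
x = hi                       # arbitrary starting point
cuts = []
for _ in range(30):
    g = fprime(x)
    cuts.append((g, f(x) - g * x))    # store cut as slope/intercept
    x = master_argmin(cuts, lo, hi)   # next iterate: master minimizer
```

After a few iterations the iterate settles at the true minimizer x = 0; real implementations solve the master problem with an LP solver rather than by enumeration.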
Variable Metric Bundle Methods: from Conceptual to Implementable Forms
, 1996
Abstract

Cited by 40 (8 self)
To minimize a convex function, we combine Moreau-Yosida regularizations, quasi-Newton matrices and bundling mechanisms. First we develop conceptual forms using "reversal" quasi-Newton formulae and we state their global and local convergence. Then, to produce implementable versions, we incorporate a bundle strategy together with a "curve-search". No convergence results are given for the implementable versions; however, some numerical illustrations show their good behaviour even for large-scale problems.
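The Moreau-Yosida regularization used above can be made concrete with a textbook one-dimensional example (not the paper's algorithm): for f(y) = |y|, the proximal point is the soft-thresholding operator, and the regularized function F_lam(x) = min_y |y| + (y - x)^2 / (2*lam) is smooth with the same minimizers as f.

```python
# Moreau-Yosida envelope of f(y) = |y|, a standard 1-D illustration.

def prox_abs(x, lam):
    """Proximal point of |.| at x: argmin_y |y| + (y - x)^2 / (2*lam)."""
    if x > lam:
        return x - lam        # shrink toward zero
    if x < -lam:
        return x + lam
    return 0.0                # small inputs are thresholded to zero

def moreau_envelope_abs(x, lam):
    """Smooth envelope value F_lam(x) evaluated at the proximal point."""
    y = prox_abs(x, lam)
    return abs(y) + (y - x) ** 2 / (2.0 * lam)

print(moreau_envelope_abs(3.0, 1.0))  # |2| + 1/2 = 2.5
print(moreau_envelope_abs(0.5, 1.0))  # quadratic zone: x^2 / (2*lam) = 0.125
```

Bundle methods like the one surveyed approximate this proximal step when f is known only through function values and subgradients.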
Filter Pattern Search Algorithms for Mixed Variable Constrained Optimization Problems
 SIAM Journal on Optimization
, 2004
Abstract

Cited by 37 (8 self)
A new class of algorithms for solving nonlinearly constrained mixed variable optimization problems is presented. This class combines and extends the Audet-Dennis Generalized Pattern Search (GPS) algorithms for bound constrained mixed variable optimization, and their GPS-filter algorithms for general nonlinear constraints. In generalizing existing algorithms, new theoretical convergence results are presented that reduce seamlessly to existing results for more specific classes of problems. While no local continuity or smoothness assumptions are required to apply the algorithm, a hierarchy of theoretical convergence results based on the Clarke calculus is given, in which local smoothness dictates what can be proved about certain limit points generated by the algorithm. To demonstrate its usefulness, the algorithm is applied to the design of a load-bearing thermal insulation system. We believe this is the first algorithm with provable convergence results to directly target this class of problems.
ACCPM - A Library for Convex Optimization Based on an Analytic Center Cutting Plane Method
 European Journal of Operational Research
, 1996
Abstract

Cited by 33 (17 self)
Introduction
We are concerned in this note with Goffin, Haurie and Vial's [7] Analytic Center Cutting Plane Method (ACCPM for short) for large-scale convex optimization. Its state-of-the-art implementation [10] is now available upon request for academic research use. Cutting plane methods for convex optimization have a long history that goes back at least to a fundamental paper of Kelley [14]. There exist numerous strategies that can be applied to "solve" subsequent relaxed master problems in the cutting plane optimization scheme. In the Analytic Center Cutting Plane Method, subsequent relaxed master problems are not solved to optimality. Instead, an approximate analytic center of the current localization set is sought. The theoretical development of ACCPM started with Goffin and Vial [9]. It was later continued in [7, 8] and led to the development of a prototype implementation of the method due to du Merle [15] that was successfully applied to solve several nont...
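The idea of querying an approximate analytic center rather than the master-problem optimum can be sketched in one dimension (an illustrative toy, not ACCPM itself): given cuts a_i * x <= b_i whose intersection is an interval, the analytic center maximizes F(x) = sum_i log(b_i - a_i * x), found here by damped Newton steps.

```python
# 1-D analytic center of a set of cuts a_i * x <= b_i (toy sketch).
# Maximizing F(x) = sum_i log(b_i - a_i*x) by Newton's method; the step
# is damped so every slack b_i - a_i*x stays strictly positive.

def analytic_center(cuts, x0, iters=50):
    x = x0
    for _ in range(iters):
        g = sum(a / (b - a * x) for a, b in cuts)         # equals -F'(x)
        h = sum((a / (b - a * x)) ** 2 for a, b in cuts)  # equals -F''(x) > 0
        step = -g / h                                     # Newton step maximizing F
        t = 1.0
        while any(b - a * (x + t * step) <= 0 for a, b in cuts):
            t *= 0.5                                      # backtrack to stay interior
        x += t * step
    return x

# Cuts x <= 2 and -x <= 1 describe [-1, 2]; the analytic center
# maximizes log(2 - x) + log(1 + x), which gives x = 1/2.
center = analytic_center([(1.0, 2.0), (-1.0, 1.0)], x0=0.0)
```

ACCPM performs the multidimensional analogue of this computation, with the cut collection growing as the oracle is queried.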
Solving Nonlinear Multicommodity Flow Problems By The Analytic Center Cutting Plane Method
, 1995
Abstract

Cited by 29 (14 self)
The paper deals with nonlinear multicommodity flow problems with convex costs. A decomposition method is proposed to solve them. The approach applies a potential reduction algorithm to solve the master problem approximately and a column generation technique to define a sequence of primal linear programming problems. Each subproblem consists of finding a minimum cost flow between an origin and a destination node in an uncapacitated network. It is thus formulated as a shortest path problem and solved with Dijkstra's d-heap algorithm. An implementation is described that takes full advantage of the supersparsity of the network in the linear algebra operations. Computational results show the efficiency of this approach on well-known nondifferentiable problems and also large-scale randomly generated problems (up to 1000 arcs and 5000 commodities). This research has been supported by the Fonds National de la Recherche Scientifique Suisse, grant #12-34002.92, NSERC-Canada and ...
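The shortest-path subproblem mentioned above can be sketched with Dijkstra's algorithm using Python's standard-library binary heap (`heapq`) rather than the d-heap used in the paper; the network and arc costs below are made up for illustration.

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source; graph: {node: [(neighbor, cost), ...]}
    with nonnegative arc costs."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # skip stale heap entries
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

net = {
    "s": [("a", 1.0), ("b", 4.0)],
    "a": [("b", 2.0), ("t", 6.0)],
    "b": [("t", 1.0)],
}
print(dijkstra(net, "s")["t"])  # s -> a -> b -> t costs 4.0
```

In the decomposition scheme, one such shortest-path call is made per commodity at each column generation step, which is why a cache-friendly heap implementation matters at scale.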
Reoptimization with the PrimalDual Interior Point Method
, 2001
Abstract

Cited by 25 (10 self)
Reoptimization techniques for an interior point method applied to solve a sequence of linear programming problems are discussed. Conditions are given for problem perturbations that can be absorbed in merely one Newton step. The analysis is performed for both short-step and long-step feasible path-following methods. A practical procedure is then derived for an infeasible path-following method. It is applied in the context of crash start for several large-scale structured linear programs. Numerical results with OOPS, the new object-oriented parallel solver, demonstrate the efficiency of the approach. For large structured linear programs, crash start leads to about a 40% reduction in the number of iterations, which translates into a 25% reduction in solution time. The crash procedure parallelizes well, and speed-ups between 3.1 and 3.8 on 4 processors are achieved.
Optimal Power Generation under Uncertainty via Stochastic Programming
 in: Stochastic Programming Methods and Technical Applications (K. Marti and P. Kall Eds.), Lecture Notes in Economics and Mathematical Systems
, 1997
Abstract

Cited by 23 (8 self)
A power generation system comprising thermal and pumped-storage hydro plants is considered. Two kinds of models for the cost-optimal generation of electric power under uncertain load are introduced: (i) a dynamic model for the short-term operation and (ii) a power production planning model. In both cases, the presence of stochastic data in the optimization model leads to multi-stage and two-stage stochastic programs, respectively. Both stochastic programming problems involve a large number of mixed-integer (stochastic) decisions, but their constraints are loosely coupled across operating power units. This is used to design Lagrangian relaxation methods for both models, which lead to a decomposition into stochastic single unit subproblems. For the dynamic model, a Lagrangian decomposition based algorithm is described in more detail. Special emphasis is put on a discussion of the duality gap, the efficient solution of the multi-stage single unit subproblems and on solving the dual problem...
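The Lagrangian relaxation idea described above can be sketched on a deliberately tiny toy (made-up numbers, not the paper's power model): two "units" must jointly cover a demand x1 + x2 >= d; dualizing that single coupling constraint with a multiplier lam splits the problem into independent single-unit subproblems, and a subgradient method updates lam.

```python
# Toy Lagrangian relaxation with subgradient dual updates (illustrative).
# Unit subproblem: min over x in [0, cap] of (cost - lam) * x, which is
# bang-bang for a linear objective: run at capacity iff lam > cost.

def unit_subproblem(cost, cap, lam):
    return cap if lam > cost else 0.0

def unit_outputs(lam):
    # Hypothetical units: cheap (cost 2, cap 6) and expensive (cost 5, cap 6).
    x1 = unit_subproblem(cost=2.0, cap=6.0, lam=lam)
    x2 = unit_subproblem(cost=5.0, cap=6.0, lam=lam)
    return x1, x2

d = 8.0          # demand coupled across the two units
lam = 0.0        # multiplier on the relaxed constraint x1 + x2 >= d
for k in range(200):
    x1, x2 = unit_outputs(lam)
    subgrad = d - (x1 + x2)                    # constraint violation
    lam = max(0.0, lam + subgrad / (k + 1))    # diminishing-step ascent
```

Here lam converges toward 5.0, the cost of the marginal unit; with integer on/off decisions, as in the paper's models, the oscillating primal outputs are exactly the duality gap the authors discuss.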
Warm Start of the PrimalDual Method Applied in the CuttingPlane Scheme
 Mathematical Programming
, 1997
Abstract

Cited by 23 (2 self)
A practical warm-start procedure is described for the infeasible primal-dual interior-point method employed to solve the restricted master problem within the cutting-plane method. In contrast to the theoretical developments in this field, the approach presented in this paper does not make the unrealistic assumption that the new cuts are shallow. Moreover, it treats systematically the case when a large number of cuts are added at one time. The technique proposed in this paper has been implemented in the context of HOPDM, the state-of-the-art, yet public domain, interior-point code. Numerical results confirm a high degree of efficiency of this approach: regardless of the number of cuts added at one time (which can be thousands in the largest examples) and regardless of the depth of the new cuts, reoptimizations are usually done within a few additional iterations. Key words. Warm start, primal-dual algorithm, cutting-plane methods. Supported by the Fonds National de la Recherche Scientifique Su...
Tighter and convex maximum margin clustering
 In AISTATS, 2009b
Abstract

Cited by 19 (10 self)
The maximum margin principle has been successfully applied to many supervised and semi-supervised problems in machine learning. Recently, this principle was extended to clustering, referred to as Maximum Margin Clustering (MMC), and achieved promising performance in recent studies. To avoid the problem of local minima, MMC can be solved globally via convex semidefinite programming (SDP) relaxation. Although many efficient approaches have been proposed to alleviate the computational burden of SDP, convex MMCs are still not scalable to medium-sized data sets. In this paper, we propose a novel convex optimization method, LG-MMC, which maximizes the margin of opposite clusters via "Label Generation". It can be shown that LG-MMC is much more scalable than existing convex approaches. Moreover, we show that our convex relaxation is tighter than state-of-the-art convex MMCs. Experiments on seventeen UCI datasets and the MNIST dataset show significant improvement over existing MMC algorithms.