Results 1–10 of 91
Convex Nondifferentiable Optimization: A Survey Focussed On The Analytic Center Cutting Plane Method.
, 1999
Abstract

Cited by 54 (2 self)
We present a survey of nondifferentiable optimization problems and methods with special focus on the analytic center cutting plane method. We propose a self-contained convergence analysis that uses the formalism of the theory of self-concordant functions, but for the main results we give direct proofs based on the properties of the logarithmic function. We also provide an in-depth analysis of two extensions that are very relevant to practical problems: the case of multiple cuts and the case of deep cuts. We further examine extensions to problems including feasible sets partially described by an explicit barrier function, and to the case of nonlinear cuts. Finally, we review several implementation issues and discuss some applications.
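The analytic center at the heart of this method is the maximizer of the logarithmic barrier over the current localization polyhedron, usually found by damped Newton steps. A minimal sketch (the box example, tolerances, and step rule are illustrative choices, not the authors' implementation):

```python
import numpy as np

def analytic_center(A, b, x0, tol=1e-8, max_iter=50):
    """Maximize sum(log(b - A @ x)) over {x : A x < b} by damped Newton.

    A, b describe the localization set; x0 must be strictly feasible.
    Illustrative sketch of the analytic-center computation only.
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        s = b - A @ x                       # slacks, must stay > 0
        g = A.T @ (1.0 / s)                 # gradient of the barrier
        H = A.T @ np.diag(1.0 / s**2) @ A   # Hessian of the barrier
        dx = np.linalg.solve(H, -g)         # Newton direction
        t = 1.0                             # damping: stay strictly feasible
        while np.any(b - A @ (x + t * dx) <= 0):
            t *= 0.5
        x = x + t * dx
        if np.linalg.norm(g) < tol:
            break
    return x

# box [-1, 1]^2: by symmetry the analytic center is the origin
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.ones(4)
center = analytic_center(A, b, np.array([0.3, -0.2]))
```

In the cutting plane method itself, each new cut adds a row to A and b and the center is recomputed from the previous one.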
Optimization with stochastic dominance constraints
 SIAM Journal on Optimization
Abstract

Cited by 37 (5 self)
We consider the problem of constructing a portfolio of finitely many assets whose returns are described by a discrete joint distribution. We propose a new portfolio optimization model involving stochastic dominance constraints on the portfolio return. We develop optimality and duality theory for these models. We construct equivalent optimization models with utility functions. Numerical illustration is provided.
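For discrete distributions, a second-order stochastic dominance constraint of the kind used here can be checked by comparing expected shortfalls at finitely many targets. A hedged sketch (the function name and the two-point example are illustrative, not the paper's model):

```python
import numpy as np

def ssd_dominates(r, p_r, y, p_y):
    """Check second-order stochastic dominance of return r over benchmark y.

    r, y: realizations; p_r, p_y: their probabilities. r dominates y in
    second order iff E[(eta - r)+] <= E[(eta - y)+] for every target eta;
    for discrete distributions it suffices to test eta at the realizations
    of y. Illustrative check only, not the authors' optimization model.
    """
    for eta in y:
        shortfall_r = np.sum(p_r * np.maximum(eta - r, 0.0))
        shortfall_y = np.sum(p_y * np.maximum(eta - y, 0.0))
        if shortfall_r > shortfall_y + 1e-12:
            return False
    return True

# same mean, less spread: the concentrated return dominates
r = np.array([0.02, 0.04]); p_r = np.array([0.5, 0.5])
y = np.array([0.00, 0.06]); p_y = np.array([0.5, 0.5])
ok = ssd_dominates(r, p_r, y, p_y)
```

In the portfolio model, r would be the portfolio return (a decision variable), so each shortfall inequality becomes a convex constraint.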
Solving Nonlinear Multicommodity Flow Problems By The Analytic Center Cutting Plane Method
, 1995
Abstract

Cited by 32 (15 self)
The paper deals with nonlinear multicommodity flow problems with convex costs. A decomposition method is proposed to solve them. The approach applies a potential reduction algorithm to solve the master problem approximately and a column generation technique to define a sequence of primal linear programming problems. Each subproblem consists of finding a minimum cost flow between an origin and a destination node in an uncapacitated network. It is thus formulated as a shortest path problem and solved with Dijkstra's d-heap algorithm. An implementation is described that takes full advantage of the supersparsity of the network in the linear algebra operations. Computational results show the efficiency of this approach on well-known nondifferentiable problems and also on large-scale randomly generated problems (up to 1000 arcs and 5000 commodities). This research has been supported by the Fonds National de la Recherche Scientifique Suisse, grant #12-34002.92, NSERC-Canada and ...
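The shortest-path subproblem above is standard Dijkstra; a minimal sketch using Python's binary heap as a stand-in for the d-heap variant the paper mentions (graph and node names are illustrative):

```python
import heapq

def dijkstra(adj, source):
    """Shortest-path distances from source, nonnegative arc costs.

    adj: {node: [(neighbor, cost), ...]}. Binary-heap stand-in for the
    d-heap implementation cited in the abstract; illustrative only.
    """
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry, already settled with a shorter path
        for v, c in adj.get(u, []):
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

adj = {"o": [("a", 1.0), ("b", 4.0)],
       "a": [("b", 2.0), ("d", 6.0)],
       "b": [("d", 1.0)]}
d = dijkstra(adj, "o")   # o -> a -> b -> d has cost 4.0
```

A d-heap simply replaces the binary heap with a d-ary one, which reduces decrease-key cost on the sparse networks described in the abstract.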
Large Margin Training for Hidden Markov Models with Partially Observed States, Trinh-Minh-Tri Do
Abstract

Cited by 29 (4 self)
Large margin learning of Continuous Density HMMs with a partially labeled dataset has been extensively studied in the speech and handwriting recognition fields. Yet due to the nonconvexity of the optimization problem, previous works usually rely on severe approximations, so that it is still an open problem. We propose a new learning algorithm that relies on nonconvex optimization and bundle methods and allows tackling the original optimization problem as is. It is proved to converge to a solution with accuracy ε at a rate O(1/ε). We provide experimental results on speech and handwriting recognition that demonstrate the potential of the method.
Enlargement Of Monotone Operators With Applications To Variational Inequalities
Abstract

Cited by 26 (13 self)
Given a point-to-set operator T, we introduce the operator T^ε defined as T^ε(x) = {u : ⟨u − v, x − y⟩ ≥ −ε for all y ∈ R^n, v ∈ T(y)}. When T is maximal monotone, T^ε inherits most properties of the ε-subdifferential; e.g., it is bounded on bounded sets, T^ε(x) contains the image through T of a sufficiently small ball around x, etc. We prove these and other relevant properties of T^ε, and apply it to generate an inexact proximal point method with generalized distances for variational inequalities, whose subproblems consist of solving problems of the form 0 ∈ H^ε(x), while the subproblems of the exact method are of the form 0 ∈ H(x). If ε_k is the coefficient used in the k-th iteration and the ε_k's are summable, then the sequence generated by the inexact algorithm is still convergent to a solution of the original problem. If the original operator is well enough behaved, then the solution set of each subproblem contains a ball around the exact solution, ...
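The exact proximal subproblem 0 ∈ H(x), with H(x) = T(x) + (x − x_k)/c, has a closed form in the simplest one-dimensional case T = ∂|·|, where it reduces to soft-thresholding. A toy sketch of that exact step (the 1-D example is illustrative; the paper works with general maximal monotone operators and generalized distances):

```python
def prox_abs(x_k, c):
    """Exact proximal step for T = subdifferential of |.|:
    solve 0 in T(x) + (x - x_k)/c, i.e. soft-thresholding.
    Illustrative 1-D instance of the exact method's subproblem."""
    if x_k > c:
        return x_k - c
    if x_k < -c:
        return x_k + c
    return 0.0

# the proximal point iteration drives x to 0, the unique zero of T
x = 5.0
for _ in range(10):
    x = prox_abs(x, 1.0)
```

The inexact method of the abstract replaces this exact solve by any point satisfying the ε_k-enlarged inclusion, which is what makes the subproblems tractable.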
Solving lift-and-project relaxations of binary integer programs
 SIAM Journal on Optimization
Abstract

Cited by 23 (1 self)
We propose a method for optimizing the lift-and-project relaxations of binary integer programs introduced by Lovász and Schrijver. In particular, we study both linear and semidefinite relaxations. The key idea is a restructuring of the relaxations, which isolates the complicating constraints and allows for a Lagrangian approach. We detail an enhanced subgradient method and discuss its efficient implementation. Computational results illustrate that our algorithm produces tight bounds more quickly than state-of-the-art linear and semidefinite solvers.
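The generic scheme behind such a Lagrangian approach is projected subgradient ascent on the concave dual. A toy sketch with a piecewise-linear dual function (the objective, step rule 1/k, and names are illustrative, not the paper's enhanced method):

```python
import numpy as np

def subgradient_ascent(f_and_subgrad, lam0, steps=200):
    """Projected subgradient ascent on a concave dual over lam >= 0,
    with diminishing steps 1/k; tracks the best iterate since the
    dual value need not increase monotonically. Toy sketch only."""
    lam = np.maximum(lam0, 0.0)
    best_val, best_lam = -np.inf, lam.copy()
    for k in range(1, steps + 1):
        val, g = f_and_subgrad(lam)
        if val > best_val:
            best_val, best_lam = val, lam.copy()
        lam = np.maximum(lam + (1.0 / k) * g, 0.0)  # step, then project
    return best_lam, best_val

# theta(lam) = min(2 - lam, 1 + lam): maximized at lam = 0.5, value 1.5
def theta(lam):
    l = lam[0]
    if 2 - l <= 1 + l:
        return 2 - l, np.array([-1.0])  # a subgradient of the active piece
    return 1 + l, np.array([1.0])

lam, val = subgradient_ascent(theta, np.array([0.0]))
```

In the paper's setting, evaluating the dual means solving the restructured relaxation without the complicating constraints; the subgradient is the residual of those constraints.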
Two Numerical Methods for Optimizing Matrix Stability
 Linear Algebra Appl
, 2001
Abstract

Cited by 22 (8 self)
Consider the affine matrix family A(x) = A_0 + Σ_{k=1}^{m} x_k A_k, mapping a design vector x ∈ R^m into the space of n × n real matrices.
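A small sketch of the objects involved: the affine family, and the spectral abscissa (max real part of the eigenvalues) as one standard stability measure that such methods push below zero. The 2×2 example is illustrative, not from the paper:

```python
import numpy as np

def affine_family(A0, As, x):
    """Evaluate A(x) = A0 + sum_k x_k * A_k for a design vector x."""
    A = A0.copy().astype(float)
    for xk, Ak in zip(x, As):
        A += xk * Ak
    return A

def spectral_abscissa(A):
    """Max real part of the eigenvalues; stability asks to make it < 0."""
    return float(np.max(np.real(np.linalg.eigvals(A))))

A0 = np.array([[0.0, 1.0], [0.0, 0.0]])
A1 = np.array([[0.0, 0.0], [-1.0, -1.0]])
# x = [1]: A(x) = [[0, 1], [-1, -1]], eigenvalues (-1 +/- i*sqrt(3))/2
alpha = spectral_abscissa(affine_family(A0, [A1], [1.0]))
```

The optimization is hard because the spectral abscissa is a nonsmooth, non-Lipschitz function of x, which is what motivates specialized numerical methods.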
Controller design via nonsmooth multidirectional search
 SIAM J. Control Optim
, 2006
Abstract

Cited by 20 (12 self)
We propose an algorithm which combines multidirectional search (MDS) with nonsmooth optimization techniques to solve difficult problems in automatic control. Applications include static and fixed-order output feedback controller design, simultaneous stabilization, H2/H∞ synthesis, and much else. We show how to combine direct search techniques with nonsmooth descent steps in order to obtain convergence certificates in the presence of nonsmoothness. Our technique is most efficient when small controllers for plants with large state dimension are sought. Our numerical testing includes several benchmark examples. For instance, our algorithm needs 0.41 seconds to compute a static output feedback stabilizing controller for the Boeing 767 flutter benchmark problem [22], a system with 55 states. The first static controller without performance specifications for this system was obtained in [16]. Keywords: NP-hard design problems, static output feedback, fixed-order synthesis, simultaneous stabilization, mixed H2/H∞ synthesis, pattern search algorithm, moving polytope, nonsmooth analysis, spectral bundle method, ε-gradients, bilinear matrix inequality (BMI).
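The plain MDS loop that the paper builds on (Torczon's multidirectional search) reflects the simplex through its best vertex, expands on success, and contracts otherwise. A bare sketch of that direct-search component alone, minimizing a smooth toy function, without the nonsmooth descent steps the paper adds:

```python
import numpy as np

def mds_minimize(f, simplex, iters=60):
    """Plain multidirectional search: reflect through the best vertex,
    expand if reflection improved, contract toward the best vertex
    otherwise. Sketch of the MDS component only; the paper combines
    it with nonsmooth descent steps."""
    V = [np.asarray(v, dtype=float) for v in simplex]
    for _ in range(iters):
        V.sort(key=f)
        v0 = V[0]                                  # best vertex, always kept
        refl = [2 * v0 - v for v in V[1:]]         # reflection step
        if min(f(r) for r in refl) < f(v0):
            exp = [3 * v0 - 2 * v for v in V[1:]]  # expansion step
            if min(f(e) for e in exp) < min(f(r) for r in refl):
                V = [v0] + exp
            else:
                V = [v0] + refl
        else:
            V = [v0] + [(v0 + v) / 2 for v in V[1:]]  # contraction step
    return min(V, key=f)

x = mds_minimize(lambda v: float(np.sum(v**2)),
                 [np.array([1.0, 1.0]), np.array([2.0, 1.0]),
                  np.array([1.0, 2.0])])
```

Because every trial point lies on a fixed pattern scaled by the simplex, MDS admits convergence guarantees without gradients, which is what the paper exploits in the nonsmooth setting.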