Results 1–10 of 29
Robust Solutions To Uncertain Semidefinite Programs
SIAM J. OPTIMIZATION, 1998
Abstract

Cited by 109 (8 self)
In this paper we consider semidefinite programs (SDPs) whose data depend on some unknown but bounded perturbation parameters. We seek "robust" solutions to such programs, that is, solutions which minimize the (worst-case) objective while satisfying the constraints for every possible value of the parameters within the given bounds. Assuming the data matrices are rational functions of the perturbation parameters, we show how to formulate sufficient conditions for a robust solution to exist as SDPs. When the perturbation is "full," our conditions are necessary and sufficient. In this case, we provide sufficient conditions which guarantee that the robust solution is unique and continuous (Hölder-stable) with respect to the unperturbed problem's data. The approach can thus be used to regularize ill-conditioned SDPs. We illustrate our results with examples taken from linear programming, maximum norm minimization, polynomial interpolation, and integer programming.
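The robust-feasibility idea can be glimpsed in a minimal scalar sketch (our own Python example, not the paper's SDP machinery; all names are hypothetical): for a linear constraint with box-bounded coefficients, feasibility for every parameter value can be checked at the vertices of the uncertainty box, and for nonnegative x this agrees with the closed-form worst-case counterpart.

```python
from itertools import product

# Hypothetical sketch of robust feasibility (not the paper's method):
# a linear constraint a . x <= b where each coefficient a[i] lies in the
# interval [a0[i] - d[i], a0[i] + d[i]].

def robust_feasible(x, a0, d, b, tol=1e-12):
    """Check a . x <= b at every vertex of the uncertainty box.

    The constraint is linear in a, so the worst case over the box is
    attained at one of its vertices.
    """
    n = len(a0)
    for signs in product((-1.0, 1.0), repeat=n):
        a = [a0[i] + signs[i] * d[i] for i in range(n)]
        if sum(a[i] * x[i] for i in range(n)) > b + tol:
            return False
    return True

def robust_counterpart(x, a0, d, b, tol=1e-12):
    """For x >= 0 the worst case is a[i] = a0[i] + d[i]; check it directly."""
    return sum((a0[i] + d[i]) * x[i] for i in range(len(a0))) <= b + tol

x, a0, d = [1.0, 2.0], [1.0, 0.5], [0.2, 0.1]
print(robust_feasible(x, a0, d, 3.0), robust_counterpart(x, a0, d, 3.0))
```

Both checks agree here; the paper's contribution is the much harder matrix-valued analogue, where the robust counterpart itself becomes an SDP.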
Robust Solutions To Uncertain Semidefinite Programs
, 1998
Abstract

Cited by 83 (3 self)
In this paper we consider semidefinite programs (SDPs) whose data depend on some unknown-but-bounded perturbation parameters. We seek "robust" solutions to such programs, that is, solutions which minimize the (worst-case) objective while satisfying the constraints for every possible value of the parameters within the given bounds. Assuming the data matrices are rational functions of the perturbation parameters, we show how to formulate sufficient conditions for a robust solution to exist as SDPs. When the perturbation is "full", our conditions are necessary and sufficient. In this case, we provide sufficient conditions which guarantee that the robust solution is unique and continuous (Hölder-stable) with respect to the unperturbed problem's data. The approach can thus be used to regularize ill-conditioned SDPs. We illustrate our results with examples taken from linear programming, maximum norm minimization, polynomial interpolation, and integer programming.
Optimization Problems with Perturbations: A Guided Tour
SIAM REVIEW, 1996
Abstract

Cited by 71 (10 self)
This paper presents an overview of some recent and significant progress in the theory of optimization with perturbations. We put the emphasis on methods based on upper and lower estimates of the value of the perturbed problem. These methods make it possible to compute expansions of the value function and approximate solutions in situations where the set of Lagrange multipliers may be unbounded, or even empty. We give rather complete results for nonlinear programming problems, and describe some partial extensions of the method to more general problems. We illustrate the results by computing the equilibrium position of a chain that is almost vertical or horizontal.
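The value-function expansions discussed above can be illustrated on a toy problem (our own example, not from the paper): for f(x, u) = (x - 1)^2 + u*x, the envelope theorem predicts v'(0) = x*(0) = 1, and a crude grid minimization recovers this by finite differences.

```python
# Toy example (ours, not from the paper): first-order expansion of the value
# function v(u) = min_x f(x, u) with f(x, u) = (x - 1)**2 + u * x.
# The envelope theorem gives v'(u) = (df/du)(x*(u)) = x*(u), so v'(0) = 1.

def v(u, lo=-5.0, hi=5.0, steps=20001):
    """Crude grid minimization of f(., u) over [lo, hi]."""
    h = (hi - lo) / (steps - 1)
    return min((lo + i * h - 1.0) ** 2 + u * (lo + i * h) for i in range(steps))

eps = 1e-3
dv = (v(eps) - v(-eps)) / (2 * eps)  # central difference approximating v'(0)
print(dv)
```

The papers surveyed above handle exactly the situations where this naive picture breaks down, e.g. when Lagrange multipliers are unbounded or absent.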
Second-order analysis for optimal control problems with pure and mixed state constraints. INRIA Research Report 6199
, 2007
Cited by 36 (20 self)
Sensitivity Analysis of Optimization Problems Under Second Order Regular Constraints
, 1996
Abstract

Cited by 22 (6 self)
We present a perturbation theory for finite-dimensional optimization problems subject to abstract constraints satisfying a second-order regularity condition. We derive Lipschitz and Hölder expansions of approximate optimal solutions, under a directional constraint qualification hypothesis and various second-order sufficient conditions that take into account the curvature of the set defining the constraints of the problem. We also show how the theory applies to semidefinite optimization and, more generally, to semi-infinite programs in which the contact set is a smooth manifold and the quadratic growth condition in the constraint space holds. As a final application we provide a result on differentiability of metric projections in finite-dimensional spaces.
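The final application mentioned, differentiability of metric projections, can be glimpsed in a toy sketch (our own example; the paper's result is far more general): the projection onto the unit ball in R^2 and a finite-difference check of its directional derivative at a point outside the ball.

```python
import math

# Toy sketch (ours): the metric projection onto the closed unit ball in R^2,
# P(x) = x / max(1, ||x||), and a finite-difference check of its directional
# derivative at a point outside the ball.

def proj(x):
    s = max(1.0, math.hypot(x[0], x[1]))
    return (x[0] / s, x[1] / s)

x, d, t = (2.0, 0.0), (0.0, 1.0), 1e-6
p0 = proj(x)
p1 = proj((x[0] + t * d[0], x[1] + t * d[1]))
dd = ((p1[0] - p0[0]) / t, (p1[1] - p0[1]) / t)
print(p0, dd)  # P(x) = (1, 0); the derivative along d is (0, 1/||x||) = (0, 0.5)
```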
Stability and sensitivity analysis for optimal control problems with a first-order state constraint and application to continuation methods
 ESAIM Control
Abstract

Cited by 17 (8 self)
This talk deals with stability and sensitivity analysis for optimal control problems of an ordinary differential equation with a first-order state constraint. We consider the case when the Hamiltonian and the state constraint are regular. Malanowski (2) obtained Lipschitz continuity and directional differentiability of solutions in L2, using generalized implicit function theorems in infinite-dimensional spaces, and without any assumptions on the structure of the trajectory. Malanowski and Maurer (3) proved that the solution and multipliers are C1 with respect to the parameter, by application of the implicit function theorem to the shooting mapping, when there are finitely many junction times and strict complementarity holds.
Strong Duality and Minimal Representations for Cone Optimization
, 2008
Abstract

Cited by 14 (2 self)
The elegant results for strong duality and strict complementarity for linear programming, LP, can fail for cone programming over nonpolyhedral cones. One can have: unattained optimal values; nonzero duality gaps; and no primal-dual optimal pair that satisfies strict complementarity. This failure is tied to the nonclosure of sums of nonpolyhedral closed cones. We take a fresh look at known and new results for duality, optimality, constraint qualifications, and strict complementarity for linear cone optimization problems in finite dimensions. These results include: weakest and universal constraint qualifications, CQs; duality and characterizations of optimality that hold without any CQ; geometry of nice and devious cones; the geometric relationships between zero duality gaps, strict complementarity, and the facial structure of cones; and the connection between theory and empirical evidence for lack of a CQ and failure of strict complementarity. One theme is the notion of a minimal representation of the cone and the constraints in order to regularize the problem and avoid both the theoretical and numerical difficulties that arise due to (near) loss of a CQ. We include a discussion on obtaining these representations efficiently.
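The nonclosure phenomenon behind these failures can be seen in a standard small example (our own numeric sketch, not taken from the paper): the sum of the second-order cone and a suitable ray is not closed, so a limit of feasible points can escape the sum.

```python
import math

# Standard small example (ours): sums of nonpolyhedral closed cones need not
# be closed.  K1 = {(x, y, z): z >= ||(x, y)||} (second-order cone),
# K2 = {t * (-1, 0, -1): t >= 0} (a ray).  The point p = (0, 1, 0) lies in
# the closure of K1 + K2 but not in K1 + K2 itself.

def approx(t):
    """A point of K1 + K2 whose distance to p = (0, 1, 0) vanishes as t grows."""
    k1 = (t, 1.0, math.hypot(t, 1.0))  # on the boundary of K1
    k2 = (-t, 0.0, -t)                 # in K2
    return tuple(a + b for a, b in zip(k1, k2))

for t in (1.0, 10.0, 1000.0):
    print(approx(t))  # third coordinate hypot(t, 1) - t -> 0 but stays > 0
```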
No-gap second-order optimality conditions for optimal control problems with a single state constraint and control
Math. Program., 117(1–2, Ser. B):21–50, 2009. hal-00697504, version 1
, 2012
Abstract

Cited by 12 (4 self)
The paper deals with optimal control problems with only one control variable and one state constraint, of arbitrary order. We consider the case of finitely many boundary arcs and touch times. We obtain a no-gap theory of second-order conditions, allowing one to characterize second-order quadratic growth.
INEXACT JOSEPHY–NEWTON FRAMEWORK FOR GENERALIZED EQUATIONS AND ITS APPLICATIONS TO LOCAL ANALYSIS OF NEWTONIAN METHODS FOR CONSTRAINED OPTIMIZATION
, 2008
Abstract

Cited by 12 (7 self)
We propose and analyze a perturbed version of the classical Josephy–Newton method for solving generalized equations. This perturbed framework is convenient for treating, in a unified way, standard sequential quadratic programming, its stabilized version, sequential quadratically constrained quadratic programming, and linearly constrained Lagrangian methods. For the linearly constrained Lagrangian methods, in particular, we obtain superlinear convergence under the second-order sufficient optimality condition and the strict Mangasarian–Fromovitz constraint qualification, while previous results in the literature assume (in addition to second-order sufficiency) the stronger linear independence constraint qualification as well as the strict complementarity condition. For the sequential quadratically constrained quadratic programming methods, we prove primal-dual superlinear/quadratic convergence under the same assumptions as above, which also gives a new result.
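In the simplest scalar case (a hedged sketch of ours, not the paper's inexact framework), a Josephy–Newton step for the generalized equation 0 in F(x) + N_C(x) with C = [0, inf) reduces to a projected Newton update, and the fast local convergence the paper studies is already visible.

```python
# Hedged scalar sketch (ours): a Josephy-Newton step for 0 in F(x) + N_C(x)
# with C = [0, inf).  Linearizing F and solving the linearized generalized
# equation gives, for F' > 0, the update x+ = max(0, x - F(x) / F'(x)).

def josephy_newton(F, dF, x, iters=8):
    for _ in range(iters):
        x = max(0.0, x - F(x) / dF(x))
    return x

F = lambda x: x ** 2 + x - 2.0   # solution of the generalized equation: x* = 1
dF = lambda x: 2.0 * x + 1.0
print(josephy_newton(F, dF, 2.0))  # converges quadratically to 1.0
```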
Optimization methods and stability of inclusions in Banach spaces
 Math. Program
Abstract

Cited by 11 (2 self)
Abstract. Our paper deals with the interrelation of optimization methods and Lipschitz stability of multifunctions in arbitrary Banach spaces. Roughly speaking, we show that linear convergence of several first-order methods and Lipschitz stability mean the same. In particular, we characterize calmness and the Aubin property by uniform (with respect to certain starting points) linear convergence of descent methods and approximate projection methods. So we obtain, e.g., solution methods (for solving equations or variational problems) which require calmness only. The relations of these methods to several known basic algorithms are discussed, and errors in the subroutines as well as deformations of the given mappings are permitted. We also recall how such deformations are related to standard algorithms like barrier, penalty, or regularization methods in optimization. Key words. Generalized equation, variational inequality, perturbation, regularization, stability criteria, metric regularity, calmness, approximate projections, penalization, successive approximation, Newton's method.
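The notion of linear convergence used above can be pinned down on a one-dimensional example (our own sketch, unrelated to the paper's Banach-space setting): a gradient step on a quadratic contracts the distance to the solution by a fixed factor per iteration.

```python
# One-dimensional sketch (ours): "linear convergence" of a first-order
# method.  For f(x) = 0.5 * (a * x - b)**2, a gradient step with a fixed
# step size contracts the error by the constant factor |1 - step * a**2|.

a, b, step = 2.0, 4.0, 0.2       # minimizer x* = b / a = 2; factor = 0.2
x, errs = 10.0, []
for _ in range(20):
    x -= step * a * (a * x - b)  # gradient step on f
    errs.append(abs(x - 2.0))

ratios = [errs[i + 1] / errs[i] for i in range(5)]
print(ratios)  # each ratio is 0.2: a linear (geometric) rate
```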