Results 1–10 of 20
Robust Solutions To Uncertain Semidefinite Programs
 SIAM J. OPTIMIZATION
, 1998
"... In this paper we consider semidefinite programs (SDPs) whose data depend on some unknown but bounded perturbation parameters. We seek "robust" solutions to such programs, that is, solutions which minimize the (worstcase) objective while satisfying the constraints for every possible value ..."
Abstract

Cited by 109 (8 self)
In this paper we consider semidefinite programs (SDPs) whose data depend on some unknown but bounded perturbation parameters. We seek "robust" solutions to such programs, that is, solutions which minimize the (worst-case) objective while satisfying the constraints for every possible value of parameters within the given bounds. Assuming the data matrices are rational functions of the perturbation parameters, we show how to formulate sufficient conditions for a robust solution to exist as SDPs. When the perturbation is "full," our conditions are necessary and sufficient. In this case, we provide sufficient conditions which guarantee that the robust solution is unique and continuous (Hölder-stable) with respect to the unperturbed problem's data. The approach can thus be used to regularize ill-conditioned SDPs. We illustrate our results with examples taken from linear programming, maximum norm minimization, polynomial interpolation, and integer programming.
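The paper works in the full SDP setting; as a minimal hypothetical sketch of the same worst-case idea for a single linear constraint under interval (box) uncertainty, where the robust counterpart has the closed form a'x + ρ‖x‖₁ ≤ b:

```python
import numpy as np

# Robust feasibility of one linear constraint a'x <= b when each
# coefficient a_i may be perturbed by at most rho (interval uncertainty).
# The worst case over all |delta_i| <= rho is a'x + rho * ||x||_1 <= b.
# This toy example is illustrative only; the paper treats general SDPs.

def robust_feasible(a, b, x, rho):
    """True if a'x <= b holds for every perturbation |delta_i| <= rho."""
    return float(a @ x + rho * np.abs(x).sum()) <= b

a = np.array([1.0, 2.0])
b = 5.0
x = np.array([1.0, 1.0])

# Nominal value is 1 + 2 = 3 <= 5; for rho = 0.5 the worst case is
# 3 + 0.5 * 2 = 4, still feasible, while rho = 2 gives 7 and fails.
```

The point mirrors the abstract: robustness is decided by a single deterministic condition on x rather than by checking each perturbation separately.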
Robust Solutions To Uncertain Semidefinite Programs
, 1998
"... In this paper we consider semidenite programs (SDPs) whose data depends on some unknownbutbounded perturbation parameters. We seek "robust" solutions to such programs, that is, solutions which minimize the (worstcase) objective while satisfying the constraints for every possible values ..."
Abstract

Cited by 83 (3 self)
In this paper we consider semidefinite programs (SDPs) whose data depend on some unknown-but-bounded perturbation parameters. We seek "robust" solutions to such programs, that is, solutions which minimize the (worst-case) objective while satisfying the constraints for every possible value of parameters within the given bounds. Assuming the data matrices are rational functions of the perturbation parameters, we show how to formulate sufficient conditions for a robust solution to exist as SDPs. When the perturbation is "full", our conditions are necessary and sufficient. In this case, we provide sufficient conditions which guarantee that the robust solution is unique and continuous (Hölder-stable) with respect to the unperturbed problem's data. The approach can thus be used to regularize ill-conditioned SDPs. We illustrate our results with examples taken from linear programming, maximum norm minimization, polynomial interpolation and integer programming.
Optimization Problems with Perturbations: A Guided Tour
 SIAM REVIEW
, 1996
"... This paper presents an overview of some recent and significant progress in the theory of optimization with perturbations. We put the emphasis on methods based on upper and lower estimates of the value of the perturbed problems. These methods allow to compute expansions of the value function and app ..."
Abstract

Cited by 71 (10 self)
This paper presents an overview of some recent and significant progress in the theory of optimization with perturbations. We put the emphasis on methods based on upper and lower estimates of the value of the perturbed problems. These methods allow one to compute expansions of the value function and approximate solutions in situations where the set of Lagrange multipliers may be unbounded, or even empty. We give rather complete results for nonlinear programming problems, and describe some partial extensions of the method to more general problems. We illustrate the results by computing the equilibrium position of a chain that is almost vertical or horizontal.
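A first-order expansion of a value function can be checked numerically on a hypothetical toy problem (not from the paper): for v(u) = min over x of (x − 1)² + u·x, the envelope theorem predicts v(u) ≈ v(0) + u·x*(0), where x*(0) = 1 is the unperturbed minimizer.

```python
import numpy as np

# Toy illustration (assumed example, not from the paper) of a first-order
# expansion of an optimal value function: v(u) = min_x (x - 1)^2 + u*x.
# The envelope theorem gives v'(0) = x*(0) = 1.

def v(u):
    xs = np.linspace(-3.0, 3.0, 60001)            # crude grid minimization
    return float(((xs - 1.0) ** 2 + u * xs).min())

eps = 1e-2
slope = (v(eps) - v(0.0)) / eps                   # numerical slope of v at 0
# slope is close to the predicted x*(0) = 1
```

Here the exact value is v(u) = u − u²/4, so the numerical slope 1 − eps/4 matches the first-order prediction up to the quadratic correction the paper's expansions account for.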
First and Second Order Analysis of Nonlinear Semidefinite Programs
 Mathematical Programming
, 1997
"... In this paper we study nonlinear semidefinite programming problems. Convexity, duality and firstorder optimality conditions for such problems are presented. A secondorder analysis is also given. Secondorder necessary and sufficient optimality conditions are derived. Finally, sensitivity analysi ..."
Abstract

Cited by 61 (10 self)
In this paper we study nonlinear semidefinite programming problems. Convexity, duality and first-order optimality conditions for such problems are presented. A second-order analysis is also given. Second-order necessary and sufficient optimality conditions are derived. Finally, sensitivity analysis of such programs is discussed. Key words: semidefinite programming, cone constraints, convex programming, duality, second-order optimality conditions, tangent cones, optimal value function, sensitivity analysis. AMS subject classification: 90C25, 90C30, 90C31. 1 Introduction. In this paper we consider the following optimization problem: (P) min_{x ∈ ℝ^m} f(x) subject to G(x) ⪯ 0. Here G : ℝ^m → S^n is a mapping from ℝ^m into the space S^n of n × n symmetric matrices and, for A, B ∈ S^n, the notation A ⪰ B (the notation A ⪯ B) means that the matrix A − B is positive semidefinite (negative semidefinite). Consider the cone K ⊆ S^n of positive semidefinite matrices. Then the co...
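The constraint G(x) ⪯ 0 in problem (P) can be tested numerically through the eigenvalues of the symmetric matrix G(x); the affine map below is a hypothetical example, not one from the paper:

```python
import numpy as np

# Checking the semidefinite constraint G(x) <= 0 (negative semidefinite):
# every eigenvalue of the symmetric matrix G(x) must be <= 0.
# G is a hypothetical affine map for illustration only.

def G(x):
    A0 = np.array([[-2.0, 0.0], [0.0, -2.0]])
    A1 = np.array([[1.0, 0.0], [0.0, 0.0]])
    A2 = np.array([[0.0, 1.0], [1.0, 0.0]])
    return A0 + x[0] * A1 + x[1] * A2

def is_nsd(M, tol=1e-9):
    """True if the symmetric matrix M is negative semidefinite."""
    return float(np.linalg.eigvalsh(M).max()) <= tol

# x = (1, 0) gives G = diag(-1, -2), feasible;
# x = (3, 0) gives G = diag(1, -2), infeasible.
```

Using `eigvalsh` (for symmetric matrices) rather than a generic eigensolver keeps the test cheap and numerically stable.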
Sensitivity Analysis of Optimization Problems Under Second Order Regular Constraints
, 1996
"... We present a perturbation theory for finite dimensional optimization problems subject to abstract constraints satisfying a second order regularity condition. We derive Lipschitz and Holder expansions of approximate optimal solutions, under a directional constraint qualification hypothesis and vari ..."
Abstract

Cited by 22 (6 self)
We present a perturbation theory for finite dimensional optimization problems subject to abstract constraints satisfying a second order regularity condition. We derive Lipschitz and Hölder expansions of approximate optimal solutions, under a directional constraint qualification hypothesis and various second order sufficient conditions that take into account the curvature of the set defining the constraints of the problem. We also show how the theory applies to semidefinite optimization and, more generally, to semi-infinite programs in which the contact set is a smooth manifold and the quadratic growth condition in the constraint space holds. As a final application we provide a result on differentiability of metric projections in finite dimensional spaces.
Quadratic Growth and Stability in Convex Programming Problems With Multiple Solutions
, 1995
"... Given a convex program with C² functions and a convex set S of solutions to the problem, we give a second order condition which guarantees that the problem does not have solutions outside of S. This condition is interpreted as a characterization for the quadratic growth of the cost function. The cr ..."
Abstract

Cited by 14 (3 self)
Given a convex program with C² functions and a convex set S of solutions to the problem, we give a second order condition which guarantees that the problem does not have solutions outside of S. This condition is interpreted as a characterization for the quadratic growth of the cost function. The crucial role in the proofs is played by a theorem describing a certain uniform regularity property of critical cones in smooth convex programs. We apply these results to the discussion of stability of solutions of a convex program under possibly nonconvex perturbations.
Amenable functions in optimization
 IN NONSMOOTH OPTIMIZATION METHODS AND APPLICATIONS
, 1992
"... ..."
On differentiability of symmetric matrix valued functions
"... With every real valued function, of a real argument, can be associated a matrix function mapping a linear space of symmetric matrices into itself. In this paper we study directional differentiability properties of such matrix functions associated with directionally differentiable real valued functio ..."
Abstract

Cited by 10 (0 self)
With every real-valued function of a real argument one can associate a matrix function mapping a linear space of symmetric matrices into itself. In this paper we study directional differentiability properties of such matrix functions associated with directionally differentiable real-valued functions. In particular, we show that matrix-valued functions inherit semismooth properties of the corresponding real-valued functions. Key words: matrix function, eigenvalues and eigenvectors, directional derivatives, semismooth mappings
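The association the abstract describes is the standard spectral one: apply the scalar function to the eigenvalues and reassemble. A minimal numpy sketch of this definition:

```python
import numpy as np

# Spectral definition of the matrix function associated with a scalar f:
# for symmetric X = Q diag(lam) Q^T, set f(X) = Q diag(f(lam)) Q^T.

def matrix_function(f, X):
    """Apply the scalar function f spectrally to a symmetric matrix X."""
    lam, Q = np.linalg.eigh(X)
    return Q @ np.diag(f(lam)) @ Q.T

X = np.array([[2.0, 1.0], [1.0, 2.0]])

# Sanity check: with f(t) = t^2 the spectral definition must agree with
# the ordinary matrix square X @ X.
sq = matrix_function(lambda t: t ** 2, X)
```

The paper's question is then how differentiability properties of the scalar f (for example semismoothness) transfer to this matrix-valued map.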
A note on sensitivity of value functions of mathematical programs with complementarity constraints
 Mathematical Programming
"... Using standard nonlinear programming (NLP) theory, we establish formulas for first and second order directional derivatives for optimal value functions of parametric mathematical programs with complementarity constraints (MPCCs). The main point is that under a linear independence condition on the ac ..."
Abstract

Cited by 7 (0 self)
Using standard nonlinear programming (NLP) theory, we establish formulas for first- and second-order directional derivatives for optimal value functions of parametric mathematical programs with complementarity constraints (MPCCs). The main point is that under a linear independence condition on the active constraint gradients, optimal value sensitivity of MPCCs is essentially the same as for NLPs, in spite of the combinatorial nature of the MPCC feasible set. Unlike the NLP case, however, second-order directional derivatives of the MPCC optimal value function show combinatorial structure.
Nonsmooth Matrix Valued Functions Defined by Singular Values
, 2002
"... Abstract. A class of matrix valued functions defined by singular values of nonsymmetric matrices is shown to have many properties analogous to matrix valued functions defined by eigenvalues of symmetric matrices. In particular, the (smoothed) matrix valued FischerBurmeister function is proved to be ..."
Abstract

Cited by 7 (0 self)
A class of matrix-valued functions defined by singular values of nonsymmetric matrices is shown to have many properties analogous to matrix-valued functions defined by eigenvalues of symmetric matrices. In particular, the (smoothed) matrix-valued Fischer-Burmeister function is proved to be strongly semismooth everywhere. This result is also used to show the strong semismoothness of the (smoothed) vector-valued Fischer-Burmeister function associated with the second-order cone. The strong semismoothness of singular values of a nonsymmetric matrix is discussed and used to analyze the quadratic convergence of Newton's method for solving the inverse singular value problem. Keywords: Fischer-Burmeister function, SVD, strong semismoothness, inverse singular value problem
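The matrix-valued function studied here builds on the scalar Fischer-Burmeister function φ(a, b) = a + b − √(a² + b²), which vanishes exactly when a ≥ 0, b ≥ 0 and ab = 0; this is what lets complementarity conditions be recast as (semismooth) equations. A quick check of the scalar version:

```python
import math

# Scalar Fischer-Burmeister function: phi(a, b) = a + b - sqrt(a^2 + b^2).
# phi(a, b) = 0 if and only if a >= 0, b >= 0 and a*b = 0, so solving
# phi = 0 is equivalent to solving the complementarity condition.

def fischer_burmeister(a, b):
    return a + b - math.sqrt(a * a + b * b)

# Complementary pairs are roots; a non-complementary pair is not:
# phi(3, 0) = 0, phi(0, 5) = 0, phi(1, 1) = 2 - sqrt(2) > 0.
```

The paper's contribution is the strong semismoothness of the matrix and second-order-cone analogues of this map, which is what underwrites quadratic convergence of semismooth Newton methods.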