Results 1–4 of 4
An Interior-Point Method for Semidefinite Programming
, 2005
Abstract

Cited by 207 (17 self)
We propose a new interior-point-based method to minimize a linear function of a matrix variable subject to linear equality and inequality constraints over the set of positive semidefinite matrices. We show that the approach is very efficient for graph bisection problems, such as max-cut. Other applications include max-min eigenvalue problems and relaxations for the stable set problem.
A New Primal-Dual Interior-Point Method for Semidefinite Programming
, 1994
Abstract

Cited by 10 (1 self)
Semidefinite programming (SDP) is a convex optimization problem in the space of symmetric matrices. Primal-dual interior-point methods for SDP are discussed. These generate primal and dual matrices $X$ and $Z$ which commute only in the limit. A new method is proposed which iterates in the space of commuting matrices. Let $\mathcal{S}^{n \times n}$ denote the set of real symmetric $n \times n$ matrices. The standard inner product on this space is $A \bullet B = \operatorname{tr} AB = \sum_{i,j} a_{ij} b_{ij}$. By $X \succeq 0$, where $X \in \mathcal{S}^{n \times n}$, we mean that $X$ is positive semidefinite. Consider the semidefinite programming problem (SDP)
$$\min \; C \bullet X \quad (1)$$
$$\text{s.t.} \quad A_i \bullet X = b_i, \; i = 1, \dots, m, \qquad X \succeq 0. \quad (2)$$
Here $C$ and $A_i$, $i = 1, \dots, m$, are all fixed matrices in $\mathcal{S}^{n \times n}$, and the unknown variable $X$ also lies in $\mathcal{S}^{n \times n}$. The semidefinite constraint on $X$ is said to be nonsmooth, since it is equivalent to a bound constraint on the least eigenvalue of $X$, which is not a differentiable function of $X$. The constraint is, how...
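The inner product and the semidefinite constraint defined in this abstract are easy to check numerically. A minimal sketch in NumPy; the matrices `C`, `A1`, and `X` below are illustrative values chosen for the example, not data from the paper:

```python
import numpy as np

def inner(A, B):
    """Standard inner product on symmetric matrices: A . B = tr(AB) = sum_ij a_ij b_ij."""
    return np.trace(A @ B)

def is_psd(X, tol=1e-10):
    """X >= 0 (positive semidefinite) iff its least eigenvalue is nonnegative."""
    return np.linalg.eigvalsh(X)[0] >= -tol

# Illustrative 2x2 instance of the SDP data (not from the paper).
C  = np.array([[1.0, 0.0], [0.0, 2.0]])
A1 = np.array([[1.0, 0.0], [0.0, 1.0]])   # constraint A1 . X = b1, i.e. tr(X) = 1
X  = np.array([[0.5, 0.0], [0.0, 0.5]])   # a feasible point

assert np.isclose(inner(A1, X), 1.0)      # equality constraint (2) holds
assert is_psd(X)                          # semidefinite constraint holds
print(inner(C, X))                        # objective value C . X, here 1.5
```

The `is_psd` check also illustrates the nonsmoothness remark above: feasibility reduces to a bound on the least eigenvalue of $X$.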
Trust Regions and Relaxations for the Quadratic Assignment Problem
 In Quadratic assignment and related problems (New
, 1993
Abstract

Cited by 6 (5 self)
General quadratic matrix minimization problems, with orthogonal constraints, arise in continuous relaxations for the (discrete) quadratic assignment problem (QAP). Currently, bounds for QAP are obtained by treating the quadratic and linear parts of the objective function of the relaxations separately. This paper handles general objectives as one function. The objectives can be both nonhomogeneous and nonconvex. The constraints are orthogonal or Loewner partial order (positive semidefinite) constraints. Comparisons are made to standard trust region subproblems. Numerical results are obtained using a parametric eigenvalue technique.
Contents: 1. Introduction; 2. Preliminary Notations and Motivation (2.1. Notations; 2.2. A Survey on Eigenvalue Bounds for the QAP; 2.3. Loewner Partial Order); 3. Optimality Conditions (3.1. First Order Conditions; 3.2. Second Order Conditions).
1991 Mathematics Subject Classification. Primary 90B80, 90C20, 90C35, 90C27; Secondary 65H20, 65K05. Key words...
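The Loewner partial order mentioned in the abstract orders symmetric matrices by positive semidefiniteness of their difference: $X \succeq Y$ iff $X - Y \succeq 0$. A minimal numerical sketch; the matrices below are illustrative, not from the paper:

```python
import numpy as np

def loewner_geq(X, Y, tol=1e-10):
    """Loewner partial order: X >= Y iff X - Y is positive semidefinite,
    i.e. the least eigenvalue of X - Y is nonnegative."""
    return bool(np.linalg.eigvalsh(X - Y)[0] >= -tol)

# Illustrative symmetric matrices (not from the paper).
X = np.array([[2.0, 0.0], [0.0, 3.0]])
Y = np.array([[1.0, 0.0], [0.0, 1.0]])

print(loewner_geq(X, Y))  # True: X - Y has eigenvalues 1 and 2
print(loewner_geq(Y, X))  # False: Y - X has negative eigenvalues
```

Note this is only a partial order: for many pairs, neither `loewner_geq(X, Y)` nor `loewner_geq(Y, X)` holds.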
On Eigenvalue Optimization, Alexander Shapiro
Abstract
Abstract. In this paper we study optimization problems involving eigenvalues of symmetric matrices. One of the difficulties with numerical analysis of such problems is that the eigenvalues, considered as functions of a symmetric matrix, are not differentiable at those points where they coalesce. We present a general framework for a smooth (differentiable) approach to such problems. It is based on the concept of transversality borrowed from differential geometry. In that framework we discuss first- and second-order optimality conditions and rates of convergence of the corresponding second-order algorithms. Finally we present some results on the sensitivity analysis of such problems. Key words: nonsmooth optimization, transversality condition, first- and second-order optimality conditions, Newton's algorithm, quadratic rate of convergence, semi-infinite programming, sensitivity analysis. AMS subject classifications.
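The non-differentiability at coalescence that this abstract refers to can be seen numerically: the largest eigenvalue of a symmetric matrix family is smooth where it is simple, but has a kink where eigenvalues cross. A small sketch; the parametrized family below is an illustrative example, not taken from the paper:

```python
import numpy as np

def lam_max(t):
    """Largest eigenvalue of the symmetric family A(t) = [[t, 0], [0, -t]].
    Its eigenvalues are t and -t, so lam_max(t) = |t|: smooth for t != 0,
    but non-differentiable at t = 0, where the two eigenvalues coalesce."""
    A = np.array([[t, 0.0], [0.0, -t]])
    return np.linalg.eigvalsh(A)[-1]  # eigvalsh returns eigenvalues in ascending order

# One-sided difference quotients around t = 0 disagree, exposing the kink.
h = 1e-6
right = (lam_max(h) - lam_max(0.0)) / h    # approx +1
left  = (lam_max(0.0) - lam_max(-h)) / h   # approx -1
print(round(right), round(left))           # 1 -1
```

Smoothing frameworks such as the transversality-based one described above are designed to work around exactly this kind of kink.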