Results 1–10 of 44
The Mathematics Of Eigenvalue Optimization
, 2003
Cited by 92 (13 self)
Abstract:
Optimization problems involving the eigenvalues of symmetric and nonsymmetric matrices present a fascinating mathematical challenge. Such problems arise often in theory and practice, particularly in engineering design, and are amenable to a rich blend of classical mathematical techniques and contemporary optimization theory. This essay presents a personal choice of some central mathematical ideas, outlined for the broad optimization community. I discuss the convex analysis of spectral functions and invariant matrix norms, touching briefly on semidefinite representability, and then outlining two broader algebraic viewpoints based on hyperbolic polynomials and Lie algebra. Analogous nonconvex notions lead into eigenvalue perturbation theory. The last third of the article concerns stability, for polynomials, matrices, and associated dynamical systems, ending with a section on robustness. The powerful and elegant language of nonsmooth analysis appears throughout, as a unifying narrative thread.
Convex analysis on the Hermitian matrices
 SIAM Journal on Optimization
, 1996
Cited by 45 (20 self)
Abstract:
There is growing interest in optimization problems with real symmetric matrices as variables. Generally the matrix functions involved are spectral: they depend only on the eigenvalues of the matrix. It is known that convex spectral functions can be characterized exactly as symmetric convex functions of the eigenvalues. A new approach to this characterization is given, via a simple Fenchel conjugacy formula. We then apply this formula to derive expressions for subdifferentials, and to study duality relationships for convex optimization problems with positive semidefinite matrices as variables. Analogous results hold for Hermitian matrices. Key Words: convexity, matrix function, Schur convexity, Fenchel duality, subdifferential, unitarily invariant, spectral function, positive semidefinite programming, quasi-Newton update. AMS 1991 Subject Classification: Primary 15A45, 49N15; Secondary 90C25, 65K10. 1 Introduction A matrix norm on the n × n complex matrices is called unitarily inv...
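The characterization described in this abstract can be illustrated numerically. The following sketch (not from the paper; it assumes NumPy is available) takes the symmetric convex function f = max and checks the convexity inequality for the spectral function F(A) = f(λ(A)) on random symmetric matrices:

```python
import numpy as np

rng = np.random.default_rng(0)

def sym(n):
    """Random real symmetric matrix."""
    M = rng.standard_normal((n, n))
    return (M + M.T) / 2

# Spectral function F(A) = f(eigenvalues(A)) with f = max, which is a
# symmetric convex function of the eigenvalues, so F should be convex.
def F(A):
    return np.linalg.eigvalsh(A).max()

n = 5
for _ in range(100):
    A, B = sym(n), sym(n)
    t = rng.uniform()
    lhs = F(t * A + (1 - t) * B)
    rhs = t * F(A) + (1 - t) * F(B)
    assert lhs <= rhs + 1e-10   # convexity inequality holds
```

Any other symmetric convex f (for example the sum of the k largest entries) could be substituted for `max` and the same check would pass, which is exactly the content of the characterization.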
Legendre Functions and the Method of Random Bregman Projections
, 1997
Cited by 44 (13 self)
Abstract:
In this paper, Bregman's method is studied within the powerful framework of Convex Analysis. New insights are obtained and the rich class of "Bregman/Legendre functions" is introduced. Bregman's method still works if the underlying function is Bregman/Legendre or, more generally, if it is Legendre but some constraint qualification additionally holds. The key advantage is the broad applicability and ...
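A minimal sketch of the method of random Bregman projections, under the simplifying assumption that the Legendre function is f(x) = ½‖x‖² (its Bregman distance is then ½‖x − y‖², so Bregman projections reduce to ordinary orthogonal projections). The two hyperplanes and the dimensions below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# With f(x) = 0.5*||x||^2 the Bregman projection onto a hyperplane
# {y : a.y = b} is the usual orthogonal projection.
def project(x, a, b):
    return x - (a @ x - b) / (a @ a) * a

# Two hyperplanes in R^3 with nonempty intersection (illustrative choice).
planes = [(np.array([1.0, 1.0, 0.0]), 1.0),
          (np.array([0.0, 1.0, 1.0]), 2.0)]

x = rng.standard_normal(3)
for _ in range(200):
    a, b = planes[rng.integers(2)]   # project onto a randomly chosen set
    x = project(x, a, b)

for a, b in planes:
    assert abs(a @ x - b) < 1e-6     # the iterates settle in the intersection
```

For a genuinely non-Euclidean Legendre function such as negative entropy, the projections no longer have this one-line closed form, which is where the Bregman/Legendre machinery of the paper earns its keep.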
Nonsmooth Analysis of Eigenvalues
 MATHEMATICAL PROGRAMMING
, 1998
Cited by 37 (12 self)
Abstract:
The eigenvalues of a symmetric matrix depend on the matrix nonsmoothly. This paper describes the nonsmooth analysis of these eigenvalues. In particular, I present a simple formula for the approximate (limiting Fréchet) subdifferential of an arbitrary function of the eigenvalues, subsuming earlier results on convex and Clarke subgradients. As an example I compute the subdifferential of the k-th largest eigenvalue.
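As a numerical companion (an illustration, not the paper's formula in its full nonsmooth generality): at a symmetric matrix with simple eigenvalues, the subdifferential of the k-th largest eigenvalue collapses to the single gradient u_k u_kᵀ, where u_k is the corresponding unit eigenvector. This can be checked against finite differences, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(2)

def lambda_k(A, k):
    """k-th largest eigenvalue of symmetric A (k = 1 is the largest)."""
    return np.sort(np.linalg.eigvalsh(A))[::-1][k - 1]

n, k = 4, 2
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                 # generically has distinct eigenvalues

# At a matrix with simple eigenvalues the subdifferential is a singleton:
# grad lambda_k(A) = u_k u_k^T  for the associated unit eigenvector u_k.
w, U = np.linalg.eigh(A)
u = U[:, np.argsort(w)[::-1][k - 1]]
grad = np.outer(u, u)

# Finite-difference check of the directional derivative <grad, H>.
H = rng.standard_normal((n, n)); H = (H + H.T) / 2
t = 1e-6
fd = (lambda_k(A + t * H, k) - lambda_k(A - t * H, k)) / (2 * t)
assert abs(fd - np.sum(grad * H)) < 1e-4
```

When eigenvalues coalesce, λ_k is no longer differentiable and the subdifferential becomes a genuine set, which is the situation the paper's formula covers.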
Twice Differentiable Spectral Functions
 SIAM J. Matrix Anal. Appl
, 2001
Cited by 28 (5 self)
Abstract:
A function F on the space of n-by-n real symmetric matrices is called spectral if it depends only on the eigenvalues of its argument. Spectral functions are just symmetric functions of the eigenvalues. We show that a spectral function is twice (continuously) differentiable at a matrix if and only if the corresponding symmetric function is twice (continuously) differentiable at the vector of eigenvalues. We give a concise and usable formula for the Hessian. Keywords: spectral function, twice differentiable, eigenvalue optimization, semidefinite program, symmetric function, perturbation theory. 2000 Mathematics Subject Classification: 47A55, 15A18, 90C22. 1 Introduction In this paper we are interested in functions F of a symmetric matrix argument that are invariant under orthogonal similarity transformations: F(U^T A U) = F(A) for all orthogonal U and symmetric A. Department of Combinatorics & Optimization, University of Waterloo, Waterloo, Ontario N2L 3G1, Canada. Email: aslewis@...
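The first-order version of this correspondence is the standard gradient formula for spectral functions, ∇F(A) = U diag(∇f(λ(A))) Uᵀ where A = U diag(λ) Uᵀ. Taking f(x) = Σ xᵢ², so that F(A) = ‖A‖_F² for symmetric A with known gradient 2A, gives an exactly checkable instance (an illustrative NumPy sketch, not the paper's Hessian formula):

```python
import numpy as np

rng = np.random.default_rng(3)

n = 4
M = rng.standard_normal((n, n))
A = (M + M.T) / 2

# Spectral function from f(x) = sum(x_i^2): F(A) = ||A||_F^2 for symmetric A.
# The gradient formula for spectral functions gives
#   grad F(A) = U diag(f'(lambda)) U^T   with A = U diag(lambda) U^T,
# which here is U diag(2*lambda) U^T = 2A.
w, U = np.linalg.eigh(A)
grad_spectral = U @ np.diag(2 * w) @ U.T

assert np.allclose(grad_spectral, 2 * A)   # matches the known gradient exactly
```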
Variational Analysis of Non-Lipschitz Spectral Functions
 MATHEMATICAL PROGRAMMING
, 1999
Cited by 24 (14 self)
Abstract:
We consider spectral functions f ∘ λ, where f is any permutation-invariant mapping from C^n to R, and λ is the eigenvalue map from C^{n×n} to C^n, ordering the eigenvalues lexicographically. For example, if f is the function "maximum real part", then f ∘ λ is the spectral abscissa, while if f is "maximum modulus", then f ∘ λ is the spectral radius. Both these spectral functions are continuous, but they are neither convex nor Lipschitz. For our analysis, we use the notion of subgradient extensively analyzed in Variational Analysis, R.T. Rockafellar and R. J.-B. Wets (Springer, 1998), which is particularly well suited to the variational analysis of non-Lipschitz spectral functions. We derive a number of necessary conditions for subgradients of spectral functions. For the spectral abscissa, we give both necessary and sufficient conditions for subgradients, and precisely identify the case where subdifferential regularity holds. We conclude by introducing the notion of semistable programming...
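The non-Lipschitz behavior mentioned here is already visible on a 2×2 example: perturbing a Jordan block splits the double zero eigenvalue into ±√ε, so the spectral abscissa and radius grow like √ε rather than linearly in ε. A small NumPy check (illustrative, not from the paper):

```python
import numpy as np

def spectral_abscissa(A):
    """Maximum real part of the eigenvalues (f = max real part)."""
    return np.linalg.eigvals(A).real.max()

def spectral_radius(A):
    """Maximum modulus of the eigenvalues (f = max modulus)."""
    return np.abs(np.linalg.eigvals(A)).max()

# Eigenvalues of the perturbed Jordan block [[0, 1], [eps, 0]] are
# +/- sqrt(eps): continuous in eps but not Lipschitz at eps = 0.
for eps in [1e-2, 1e-4, 1e-6]:
    A = np.array([[0.0, 1.0], [eps, 0.0]])
    assert abs(spectral_abscissa(A) - np.sqrt(eps)) < 1e-8
    assert abs(spectral_radius(A) - np.sqrt(eps)) < 1e-8
```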
Group Invariance and Convex Matrix Analysis
 SIAM J. Matrix Anal. Appl
, 1995
Cited by 20 (9 self)
Abstract:
Certain interesting classes of functions on a real inner product space are invariant under an associated group of orthogonal linear transformations. This invariance can be made explicit via a simple decomposition. For example, rotationally invariant functions on R^2 are just even functions of the Euclidean norm, and functions on the Hermitian matrices (with trace inner product) which are invariant under unitary similarity transformations are just symmetric functions of the eigenvalues. We develop a framework for answering geometric and analytic (both classical and nonsmooth) questions about such a function by answering the corresponding question for the (much simpler) function appearing in the decomposition. The aim is to understand and extend the foundations of eigenvalue optimization, matrix approximation, and semidefinite programming. 1 Introduction Why is there such a strong parallel between, on the one hand, semidefinite programming and other eigenvalue optimizat...
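A toy check of the invariance in the matrix example above, assuming NumPy: a function defined through the eigenvalues alone is automatically unchanged by orthogonal similarity transformations, which is the one direction of the decomposition that can be verified mechanically.

```python
import numpy as np

rng = np.random.default_rng(4)

# A function of a symmetric matrix defined through its eigenvalues only
# (here the sum of the two largest) is invariant under orthogonal similarity.
def F(A):
    return np.sort(np.linalg.eigvalsh(A))[-2:].sum()

n = 5
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # random orthogonal matrix

assert np.isclose(F(Q.T @ A @ Q), F(A))   # invariance under similarity
```

The paper's harder converse direction, that every invariant function arises this way and inherits its analytic properties from the symmetric function of the eigenvalues, is what the decomposition framework establishes.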
A Survey of Subdifferential Calculus with Applications
 TMA
, 1998
Cited by 14 (6 self)
Abstract:
This survey is an account of the current status of subdifferential research. It is intended to serve as an entry point for researchers and graduate students in a wide variety of pure and applied analysis areas who might profitably use subdifferentials as tools.
A Spectral Quadratic-SDP Method with Applications to Fixed-Order H2 and H∞ Synthesis
 Asian Control Conference
, 2004
Cited by 13 (7 self)
Abstract:
In this paper, we discuss a spectral quadratic-SDP method for the iterative resolution of fixed-order H2 and H∞ design problems. These problems can be cast as regular SDP programs with additional nonlinear equality constraints. When the inequalities are absorbed into a Lagrangian function, the problem reduces to solving a sequence of SDPs with a quadratic objective function, for which a spectral SDP method has been developed. Along with a description of the spectral SDP method used to solve the tangent subproblems, we report a number of computational results for validation purposes.
LOW-RANK OPTIMIZATION ON THE CONE OF POSITIVE SEMIDEFINITE MATRICES
Cited by 10 (2 self)
Abstract:
We propose an algorithm for solving optimization problems defined on a subset of the cone of symmetric positive semidefinite matrices. This algorithm relies on the factorization X = YY^T, where the number of columns of Y fixes an upper bound on the rank of the positive semidefinite matrix X. It is thus very effective for solving problems that have a low-rank solution. The factorization X = YY^T leads to a reformulation of the original problem as an optimization on a particular quotient manifold. The present paper discusses the geometry of that manifold and derives a second-order optimization method with guaranteed quadratic convergence. It furthermore provides some conditions on the rank of the factorization to ensure equivalence with the original problem. In contrast to existing methods, the proposed algorithm converges monotonically to the sought solution. Its numerical efficiency is evaluated on two applications: the maximal cut of a graph and the problem of sparse principal component analysis. Key words: low-rank constraints, cone of symmetric positive definite matrices, Riemannian quotient manifold, sparse principal component analysis, maximum-cut algorithms, large-scale algorithms
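A minimal sketch of the factorization idea only, using plain gradient descent for illustration (the paper's actual method is a second-order Riemannian algorithm on the quotient manifold, which this does not reproduce). Writing X = YYᵀ with Y of shape (n, p) makes X positive semidefinite of rank at most p by construction, so an unconstrained method can run directly on Y:

```python
import numpy as np

rng = np.random.default_rng(5)

n, p = 6, 2
B = rng.standard_normal((n, n))
C = B @ B.T                           # a PSD target matrix (illustrative)

# Minimize phi(Y) = ||Y Y^T - C||_F^2 over unconstrained Y of shape (n, p);
# grad phi(Y) = 4 (Y Y^T - C) Y.
Y = 0.1 * rng.standard_normal((n, p))
obj = lambda Y: np.linalg.norm(Y @ Y.T - C) ** 2
start = obj(Y)
step = 1e-3
for _ in range(2000):
    Y -= step * 4 * (Y @ Y.T - C) @ Y

X = Y @ Y.T
assert np.linalg.matrix_rank(X) <= p           # rank bound holds by construction
assert np.linalg.eigvalsh(X).min() > -1e-10    # X is PSD by construction
assert obj(Y) < start                          # descent made progress
```

The point of the sketch is the feasibility-for-free property of the factorization; handling the non-uniqueness of Y (Y and YQ give the same X for orthogonal Q) is what motivates the quotient-manifold geometry studied in the paper.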