Results 1 - 4 of 4
Derivatives of Spectral Functions
, 1996
Abstract

Cited by 68 (13 self)
A spectral function of a Hermitian matrix X is a function which depends only on the eigenvalues of X, λ_1(X) ≥ λ_2(X) ≥ … ≥ λ_n(X), and hence may be written f(λ_1(X), λ_2(X), …, λ_n(X)) for some symmetric function f. Such functions appear in a wide variety of matrix optimization problems. We give a simple proof that this spectral function is differentiable at X if and only if the function f is differentiable at the vector λ(X), and we give a concise formula for the derivative. We then apply this formula to deduce an analogous expression for the Clarke generalized gradient of the spectral function. A similar result holds for real symmetric matrices.

1 Introduction and notation

Optimization problems involving a symmetric matrix variable, X say, frequently involve symmetric functions of the eigenvalues of X in the objective or constraints. Examples include the maximum eigenvalue of X, or log(det X) (for positive definite X), or eigenvalue constraints such as positive semidefinit...
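The derivative formula this abstract refers to is, in the smooth case, ∇(f∘λ)(X) = U diag(∇f(λ(X))) Uᵀ for any eigendecomposition X = U diag(λ(X)) Uᵀ. A minimal numerical sketch (not taken from the paper itself) checks this for f(x) = Σ log x_i, i.e. F(X) = log det X, whose gradient is the known X⁻¹:

```python
import numpy as np

# Hedged sketch: verify the spectral derivative formula
#   grad F(X) = U diag(grad f(lambda(X))) U^T
# for f(x) = sum(log x_i), so F(X) = log det X and grad F(X) = X^{-1}.

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
X = A @ A.T + 4 * np.eye(4)          # symmetric positive definite test matrix

lam, U = np.linalg.eigh(X)           # X = U diag(lam) U^T
grad_f = 1.0 / lam                   # gradient of f(x) = sum(log x) at lam
grad_F = U @ np.diag(grad_f) @ U.T   # the spectral derivative formula

assert np.allclose(grad_F, np.linalg.inv(X))  # agrees with the log det gradient
```

The check is ordering-independent: `eigh` returns eigenvalues in ascending order, but the formula pairs each eigenvalue with its own eigenvector, so any consistent ordering works.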
Convex analysis on the Hermitian matrices
 SIAM Journal on Optimization
, 1996
Abstract

Cited by 63 (19 self)
There is growing interest in optimization problems with real symmetric matrices as variables. Generally the matrix functions involved are spectral: they depend only on the eigenvalues of the matrix. It is known that convex spectral functions can be characterized exactly as symmetric convex functions of the eigenvalues. A new approach to this characterization is given, via a simple Fenchel conjugacy formula. We then apply this formula to derive expressions for subdifferentials, and to study duality relationships for convex optimization problems with positive semidefinite matrices as variables. Analogous results hold for Hermitian matrices.

Key Words: convexity, matrix function, Schur convexity, Fenchel duality, subdifferential, unitarily invariant, spectral function, positive semidefinite programming, quasi-Newton update.

AMS 1991 Subject Classification: Primary 15A45, 49N15; Secondary 90C25, 65K10.

1 Introduction

A matrix norm on the n × n complex matrices is called unitarily inv...
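A key inequality behind conjugacy formulas of this kind is Fan's trace inequality, trace(XY) ≤ ⟨λ(X), λ(Y)⟩ when both eigenvalue vectors are sorted the same way. A quick numerical sketch (an illustration under our own setup, not code from the paper) confirms it on random symmetric matrices:

```python
import numpy as np

# Hedged sketch: Fan's inequality for symmetric X, Y:
#   trace(XY) <= <lambda(X), lambda(Y)>, eigenvalues sorted identically.

rng = np.random.default_rng(1)

def sym(n):
    """Random symmetric n x n matrix."""
    A = rng.standard_normal((n, n))
    return (A + A.T) / 2

for _ in range(100):
    X, Y = sym(5), sym(5)
    lhs = np.trace(X @ Y)
    # eigvalsh returns eigenvalues in ascending order for both matrices,
    # which is the pairing that maximizes the inner product.
    rhs = np.linalg.eigvalsh(X) @ np.linalg.eigvalsh(Y)
    assert lhs <= rhs + 1e-9
```

Equality holds exactly when X and Y share an ordered eigenbasis, which is where the subdifferential expressions mentioned in the abstract come from.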
Group Invariance and Convex Matrix Analysis
 SIAM J. Matrix Anal. Appl
, 1995
Abstract

Cited by 21 (8 self)
Certain interesting classes of functions on a real inner product space are invariant under an associated group of orthogonal linear transformations. This invariance can be made explicit via a simple decomposition. For example, rotationally invariant functions on R² are just even functions of the Euclidean norm, and functions on the Hermitian matrices (with trace inner product) which are invariant under unitary similarity transformations are just symmetric functions of the eigenvalues. We develop a framework for answering geometric and analytic (both classical and nonsmooth) questions about such a function by answering the corresponding question for the (much simpler) function appearing in the decomposition. The aim is to understand and extend the foundations of eigenvalue optimization, matrix approximation, and semidefinite programming.

1 Introduction

Why is there such a strong parallel between, on the one hand, semidefinite programming and other eigenvalue optimizat...
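The invariance in the symmetric-matrix case is easy to see numerically: any function of a symmetric matrix that depends only on its eigenvalues is unchanged by orthogonal similarity X → QXQᵀ. A minimal sketch (our own illustration, using the maximum eigenvalue as the invariant function):

```python
import numpy as np

# Hedged sketch: F(X) = max eigenvalue depends only on lambda(X),
# so it is invariant under orthogonal similarity X -> Q X Q^T.

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
X = (A + A.T) / 2                                  # random symmetric matrix
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # random orthogonal matrix

F = lambda M: np.linalg.eigvalsh(M).max()
assert np.isclose(F(X), F(Q @ X @ Q.T))            # invariance holds
```

The decomposition the abstract describes runs this observation in reverse: every such invariant function factors through the eigenvalue map, reducing matrix questions to vector ones.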
A Unifying Investigation of Interior-Point Methods for Convex Programming
 Faculty of Mathematics and Informatics, TU Delft, NL-2628 BL
, 1992
Abstract

Cited by 5 (4 self)
In the recent past a number of papers have presented low-complexity interior-point methods for different classes of convex programs. The goal of this article is to show that the logarithmic barrier function associated with these programs is self-concordant, and that the analyses of interior-point methods for these programs can thus be reduced to the analysis of interior-point methods with self-concordant barrier functions.
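Self-concordance of a one-dimensional barrier f means |f'''(x)| ≤ 2 f''(x)^{3/2} on the domain. For the logarithmic barrier f(x) = -log x on x > 0 this holds with equality, which a quick numerical sketch (our own check, not from the article) makes concrete:

```python
import numpy as np

# Hedged sketch: self-concordance condition |f'''| <= 2 * (f'')^{3/2}
# for the logarithmic barrier f(x) = -log(x) on x > 0:
#   f''(x) = 1/x^2,  f'''(x) = -2/x^3,
# so |f'''(x)| = 2/x^3 = 2 * (1/x^2)^{3/2}: equality everywhere.

xs = np.linspace(0.1, 10.0, 1000)
f2 = 1.0 / xs**2          # second derivative of -log x
f3 = -2.0 / xs**3         # third derivative of -log x
assert np.allclose(np.abs(f3), 2.0 * f2**1.5)
```

Equality in the condition is what makes -log x the canonical self-concordant barrier and lets the interior-point analyses be unified as the abstract describes.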