Results 1–10 of 10
The Mathematics Of Eigenvalue Optimization
, 2003
Abstract
Cited by 92 (13 self)
Optimization problems involving the eigenvalues of symmetric and nonsymmetric matrices present a fascinating mathematical challenge. Such problems arise often in theory and practice, particularly in engineering design, and are amenable to a rich blend of classical mathematical techniques and contemporary optimization theory. This essay presents a personal choice of some central mathematical ideas, outlined for the broad optimization community. I discuss the convex analysis of spectral functions and invariant matrix norms, touching briefly on semidefinite representability, and then outlining two broader algebraic viewpoints based on hyperbolic polynomials and Lie algebra. Analogous nonconvex notions lead into eigenvalue perturbation theory. The last third of the article concerns stability, for polynomials, matrices, and associated dynamical systems, ending with a section on robustness. The powerful and elegant language of nonsmooth analysis appears throughout, as a unifying narrative thread.
Convex analysis on the Hermitian matrices
 SIAM Journal on Optimization
, 1996
Abstract
Cited by 45 (20 self)
There is growing interest in optimization problems with real symmetric matrices as variables. Generally the matrix functions involved are spectral: they depend only on the eigenvalues of the matrix. It is known that convex spectral functions can be characterized exactly as symmetric convex functions of the eigenvalues. A new approach to this characterization is given, via a simple Fenchel conjugacy formula. We then apply this formula to derive expressions for subdifferentials, and to study duality relationships for convex optimization problems with positive semidefinite matrices as variables. Analogous results hold for Hermitian matrices.
Key Words: convexity, matrix function, Schur convexity, Fenchel duality, subdifferential, unitarily invariant, spectral function, positive semidefinite programming, quasi-Newton update.
AMS 1991 Subject Classification: Primary 15A45, 49N15; Secondary 90C25, 65K10.
1 Introduction
A matrix norm on the n × n complex matrices is called unitarily inv...
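The "simple Fenchel conjugacy formula" the abstract refers to can be sketched as follows (a statement of the known result in the notation of the other entries, with λ(X) the vector of eigenvalues of a symmetric or Hermitian matrix X):

```latex
% For a symmetric function f : R^n \to (-\infty, +\infty] and the
% eigenvalue map \lambda on the Hermitian matrices, the Fenchel
% conjugate of the spectral function f \circ \lambda is again spectral:
(f \circ \lambda)^{*} \;=\; f^{*} \circ \lambda .
% In particular, f \circ \lambda is convex exactly when f is convex,
% recovering the characterization stated in the abstract.
```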
The Convex Analysis of Unitarily Invariant Matrix Functions
, 1995
Abstract
Cited by 28 (3 self)
this paper is to give a simple, self-contained approach to this problem, giving back the subdifferential formula for (1.2) in [13] for example. Our idea will be to generalize von Neumann's result somewhat by asking which convex functions (rather than simply norms) are unitarily invariant: appropriately, the key idea will be a Fenchel conjugacy formula analogous to von Neumann's polarity formula (1.1): (f ∘ σ)
Group Invariance and Convex Matrix Analysis
 SIAM J. Matrix Anal. Appl
, 1995
Abstract
Cited by 20 (9 self)
Certain interesting classes of functions on a real inner product space are invariant under an associated group of orthogonal linear transformations. This invariance can be made explicit via a simple decomposition. For example, rotationally invariant functions on R^2 are just even functions of the Euclidean norm, and functions on the Hermitian matrices (with trace inner product) which are invariant under unitary similarity transformations are just symmetric functions of the eigenvalues. We develop a framework for answering geometric and analytic (both classical and nonsmooth) questions about such a function by answering the corresponding question for the (much simpler) function appearing in the decomposition. The aim is to understand and extend the foundations of eigenvalue optimization, matrix approximation, and semidefinite programming.
1 Introduction
Why is there such a strong parallel between, on the one hand, semidefinite programming and other eigenvalue optimizat...
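The decomposition the abstract describes is easy to check numerically. The sketch below (not from the paper; the choice of f(A) = log det A is an illustrative example) verifies that a unitarily-similarity-invariant function of a Hermitian matrix equals a symmetric function of its eigenvalues:

```python
# Sketch: "invariant function = symmetric function of the eigenvalues",
# illustrated with the spectral function f(A) = log det A = sum_i log lambda_i(A).
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A random Hermitian positive definite matrix.
b = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
a = b @ b.conj().T + n * np.eye(n)

# A random unitary matrix, via QR factorization.
z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
q, _ = np.linalg.qr(z)

f = lambda m: np.log(np.linalg.det(m)).real

# Invariance under unitary similarity: f(q a q*) == f(a).
assert np.isclose(f(q @ a @ q.conj().T), f(a))

# Decomposition: f is the symmetric function g(x) = sum(log x)
# composed with the eigenvalue map.
eigs = np.linalg.eigvalsh(a)
assert np.isclose(np.sum(np.log(eigs)), f(a))
print("invariance and eigenvalue decomposition verified")
```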
Convex analysis on Cartan subspaces
 Nonlinear Analysis, Theory, Methods and Applications
, 1998
Abstract
Cited by 9 (4 self)
The convex analysis of unitarily invariant matrix norms is important in matrix approximation. Analogously, the convex analysis of spectral functions of symmetric matrices is significant for eigenvalue optimization and semidefinite programming. We unify the two theories using the Kostant Convexity Theorem for semisimple Lie algebras.
1 Unitarily invariant norms and convex spectral functions
In 1937, von Neumann [31] gave a famous characterization of unitarily invariant matrix norms (that is, norms f on C^{p×q} satisfying f(uxv) = f(x) for all unitary matrices u and v and matrices x in C^{p×q}). His result states that such norms are those functions of the form g ∘ σ, where the map x ∈ C^{p×q} ↦ σ(x) ∈ R^p has components the singular values σ_1(x) ≥ σ_2(x) ≥ … ≥ σ_p(x) of x (assuming p ≤ q) and g is a norm on R^p, invariant under sign changes and permutations of components. Furthermore, he showed the respective dual norms satisfy (g ∘ σ)^D = g^D ...
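Von Neumann's characterization can be illustrated numerically. The sketch below (an illustration, not from the paper; the symmetric gauge g(s) = Σ s_i is chosen for concreteness, giving the nuclear norm) checks that a norm of the form g ∘ σ is invariant under x ↦ u x v for unitary u and v:

```python
# Sketch: a symmetric gauge function of the singular values is a
# unitarily invariant norm. Here g(s) = sum(s), i.e. the nuclear norm.
import numpy as np

rng = np.random.default_rng(1)
p, q_dim = 3, 5  # p <= q, as in the abstract

x = rng.standard_normal((p, q_dim)) + 1j * rng.standard_normal((p, q_dim))

def random_unitary(n, rng):
    z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    u, _ = np.linalg.qr(z)
    return u

u = random_unitary(p, rng)
v = random_unitary(q_dim, rng)

# g o sigma with g(s) = sum(s): the nuclear (trace) norm.
nuclear = lambda m: np.linalg.svd(m, compute_uv=False).sum()

# Invariance under x -> u x v for unitary u, v.
assert np.isclose(nuclear(u @ x @ v), nuclear(x))
print("unitary invariance of g o sigma verified")
```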
On Approximation Problems With ZeroTrace Matrices
, 1994
Abstract
Cited by 2 (0 self)
In this paper we consider some approximation problems in the linear space of complex matrices with respect to unitarily invariant norms. We deal with special cases of approximation of a matrix by zero-trace matrices. Moreover, some characterizations of zero-trace matrices are given by means of matrix approximation problems.
1. INTRODUCTION
Let A = [a_ij] ∈ C^{n×n} be a complex matrix. The trace of A is equal to tr(A) = Σ_j a_jj. It is well-known that tr(A) = 0 if and only if A is a commutator, that is, A = XY − YX for some matrices X and Y. In this paper we consider some approximation problems, involving zero-trace matrices, with respect to an arbitrary unitarily invariant norm ‖·‖. A norm ‖·‖ is unitarily invariant if ‖UA‖ = ‖AV‖ = ‖A‖ for all unitary matrices U and V. The most popular unitarily invariant norms are the c_p ...
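The easy direction of the cited fact — every commutator has zero trace, since tr(XY) = tr(YX) — can be sketched in a few lines (names X, Y follow the abstract; the random matrices are purely illustrative):

```python
# Sketch: a commutator A = XY - YX always satisfies tr(A) = 0,
# because the trace is invariant under cyclic permutation: tr(XY) = tr(YX).
import numpy as np

rng = np.random.default_rng(2)
n = 5
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Y = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

A = X @ Y - Y @ X
assert np.isclose(np.trace(A), 0.0)  # zero up to floating-point rounding
print("tr(XY - YX) = 0 verified")
```

(The converse — every zero-trace matrix is a commutator — is the deep half of the theorem and needs an actual construction.)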
Design of a Class of Multirate Systems Using a Maximum Relative l²-Error Criterion
, 1996
Abstract
Cited by 1 (0 self)
A criterion for designing the class of multirate systems for rate-changing is presented. This criterion arises from a model-matching perspective with maximum relative l² ...
SEMIDEFINITE PROGRAMMING*
Abstract
Abstract. In semidefinite programming, one minimizes a linear function subject to the constraint that an affine combination of symmetric matrices is positive semidefinite. Such a constraint is nonlinear and nonsmooth, but convex, so semidefinite programs are convex optimization problems. Semidefinite programming unifies several standard problems (e.g., linear and quadratic programming) and finds many applications in engineering and combinatorial optimization. Although semidefinite programs are much more general than linear programs, they are not much harder to solve. Most interior-point methods for linear programming have been generalized to semidefinite programs. As in linear programming, these methods have polynomial worst-case complexity and perform very well in practice. This paper gives a survey of the theory and applications of semidefinite programs and an introduction to primal-dual interior-point methods for their solution.
Key words: semidefinite programming, convex optimization, interior-point methods, eigenvalue optimization, combinatorial optimization, system and control theory
AMS subject classifications: 65K05, 49M45, 93B51, 90C25, 90C27, 90C90, 15A18
1. Introduction.
1.1. Semidefinite programming. We consider the problem of minimizing a linear function
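The problem class described in the abstract is conventionally written in the following standard form (a sketch; the data symbols c and F_i are the generic ones used in the SDP literature, not taken from this abstract):

```latex
% Primal semidefinite program: minimize a linear function of x subject to
% an affine combination of symmetric matrices being positive semidefinite.
\min_{x \in \mathbb{R}^m} \; c^{\mathsf T} x
\quad \text{subject to} \quad
F(x) \;:=\; F_0 + \sum_{i=1}^{m} x_i F_i \;\succeq\; 0,
% with symmetric data matrices F_0, \dots, F_m. The associated dual is
\max_{Z} \; -\operatorname{tr}(F_0 Z)
\quad \text{subject to} \quad
\operatorname{tr}(F_i Z) = c_i \;\; (i = 1, \dots, m), \quad Z \succeq 0.
```

Linear programming is recovered when all the F_i are diagonal, which is one sense in which SDP "unifies several standard problems."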
Convex analysis on Cartan subspaces
, 1997
Abstract
1. Unitarily invariant norms and convex spectral functions
In 1937, von Neumann [31] gave a famous characterization of unitarily invariant matrix norms (that is, norms f on C^{p×q} satisfying f(uxv) = f(x) for all unitary matrices u and v and matrices x in C^{p×q}). His result states that such norms are those functions of the form g ∘ σ, where the map x ∈ C^{p×q} ↦ σ(x) ∈ R^p has components the singular values σ_1(x) ≥ σ_2(x) ≥ ··· ≥ σ_p(x) of x (assuming p ≤ q) and g is a norm on R^p, invariant under sign changes and permutations of components. Furthermore, he showed the respective dual norms satisfy
(g ∘ σ)^D = g^D ∘ σ    (1.1)
(where we regard C^{p×q} as a Euclidean space with inner product ⟨x, y⟩ = Re tr x*y for matrices x and y in C^{p×q}).