## Optimizing Eigenvalues of Symmetric Definite Pencils (1994)

Venue: Proceedings of the 1994 American Control Conference

Citations: 7 (0 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Haeberly94optimizingeigenvalues,
  author    = {Jean-Pierre A. Haeberly and Michael L. Overton},
  title     = {Optimizing Eigenvalues of Symmetric Definite Pencils},
  booktitle = {Proceedings of the 1994 American Control Conference},
  year      = {1994},
  pages     = {836--839}
}
```

### Abstract

We consider the following quasiconvex optimization problem: minimize the largest eigenvalue of a symmetric definite matrix pencil depending on parameters. A new form of optimality conditions is given, emphasizing a complementarity condition on primal and dual matrices. Newton's method is then applied to these conditions to give a new quadratically convergent interior-point method which works well in practice. The algorithm is closely related to primal-dual interior-point methods for semidefinite programming.

1. Introduction. Many matrix inequality problems in control can be cast in the form: minimize the maximum eigenvalue of the Hermitian definite pencil (A(x), B(x)) with respect to a parameter vector x, subject to positive definite constraints on B(x) and sometimes also on other Hermitian matrix functions of x. The maximum eigenvalue is a quasiconvex function of the pencil elements, and therefore of the parameter vector x if A and B depend affinely on x. This quasiconvexity reduces to convexity i...
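The objective described above can be evaluated directly with standard numerical tools. The sketch below (an illustration, not the paper's algorithm) computes the largest eigenvalue of a symmetric definite pencil (A(x), B(x)) for an affine family; the matrices A0, A1, B0, B1 are made-up data chosen so that B(x) stays positive definite near x = 0.

```python
import numpy as np
from scipy.linalg import eigh

def lambda_max(A, B):
    """Largest eigenvalue of the symmetric definite pencil (A, B),
    i.e. the largest root of det(lambda*B - A) = 0, with B positive definite."""
    return eigh(A, B, eigvals_only=True)[-1]

# Illustrative affine families A(x) = A0 + x*A1, B(x) = B0 + x*B1.
A0 = np.array([[2.0, 1.0], [1.0, 3.0]])
A1 = np.array([[1.0, 0.0], [0.0, -1.0]])
B0 = np.eye(2)
B1 = np.array([[0.1, 0.0], [0.0, 0.2]])

def objective(x):
    """The quasiconvex objective: max eigenvalue of the pencil at x."""
    return lambda_max(A0 + x * A1, B0 + x * B1)

print(objective(0.0))  # at x = 0, B(0) = I, so this is just lambda_max(A0)
```

Because `B0` is the identity, `objective(0.0)` reduces to the largest ordinary eigenvalue of `A0`, which makes the sketch easy to sanity-check by hand.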

### Citations

473 | Interior point methods in semidefinite programming with applications to combinatorial optimization
- Alizadeh
- 1995

Citation Context: ...problem of minimizing a linear function subject to semidefinite constraints on linear matrix families. A duality theory, completely analogous to the theory of linear programming (LP), is known for SDP [1]. In the general case as well as in the special case of SDP, the eigenvalues are generally not differentiable at the solution point. This is because the eigenvalues of a matrix or pencil fail to be di...

341 | Interior point polynomial algorithms in convex programming
- Nesterov, Nemirovskii
- 1994

Citation Context: ...ars it has been realized that the interior point methods which have been so successful for LP can be extended to solve eigenvalue optimization problems. For optimizing the eigenvalues of pencils, see [3,4,5]. These methods are based on Huard's method of centers and they consist of an "outer iteration", each step of which requires the solution of a nonlinear problem using an "inner iteration". Usually Ne...

119 | Interior point methods for linear programming: Ready for production use
- Marsten, Shanno
- 1990

Citation Context: ...it is argued that essentially any interior point method designed for LP can be extended to solve SDP. In LP it is now generally agreed that primal-dual interior-point methods are especially efficient [7]. A specific primal-dual method for SDP with a proof of global convergence was given by [8]. A related method was given by [9]. A different approach to primal-dual interior point methods for SDP is gi...

84 | A primal-dual potential reduction method for problems involving matrix inequalities
- Vandenberghe, Boyd
- 1995

Citation Context: ...solve SDP. In LP it is now generally agreed that primal-dual interior-point methods are especially efficient [7]. A specific primal-dual method for SDP with a proof of global convergence was given by [8]. A related method was given by [9]. A different approach to primal-dual interior point methods for SDP is given in [10]. 4. Quadratically Convergent Local Methods In [2], the authors derived a quadra...

78 | Large-scale optimization of eigenvalues
- Overton
- 1992

Citation Context: ...ergent Local Methods In [2], the authors derived a quadratically convergent local method for optimizing eigenvalues of pencils. This method extended earlier work on optimizing eigenvalues of matrices [11,12]. Note that each step of this algorithm requires only the solution of a linear system of equations, though the form of the equations is quite complicated. Even in the case of matrix eigenvalue optimiz...

71 | On minimizing the maximum eigenvalue of a symmetric matrix
- Overton
- 1988

Citation Context: ...ergent Local Methods In [2], the authors derived a quadratically convergent local method for optimizing eigenvalues of pencils. This method extended earlier work on optimizing eigenvalues of matrices [11,12]. Note that each step of this algorithm requires only the solution of a linear system of equations, though the form of the equations is quite complicated. Even in the case of matrix eigenvalue optimiz...

26 | Second derivatives for optimizing eigenvalues of symmetric matrices
- Overton, Womersley
- 1995

Citation Context: ...e solution of a linear system of equations, though the form of the equations is quite complicated. Even in the case of matrix eigenvalue optimization, the proof of quadratic convergence is nontrivial [13], since the method cannot be described as the straightforward application of Newton's method to a nonlinear system. The method verifies optimality by explicitly computing the matrix V given in Theorem...

10 | Max-min eigenvalue problems, primal-dual interior point algorithms, and trust region subproblems
- Rendl, Vanderbei, et al.
- 1993

Citation Context: ...y agreed that primal-dual interior-point methods are especially efficient [7]. A specific primal-dual method for SDP with a proof of global convergence was given by [8]. A related method was given by [9]. A different approach to primal-dual interior point methods for SDP is given in [10]. 4. Quadratically Convergent Local Methods In [2], the authors derived a quadratically convergent local method for...

8 | Method of centers for minimizing generalized eigenvalues
- Boyd, El Ghaoui
- 1992

Citation Context: ...ars it has been realized that the interior point methods which have been so successful for LP can be extended to solve eigenvalue optimization problems. For optimizing the eigenvalues of pencils, see [3,4,5]. These methods are based on Huard's method of centers and they consist of an "outer iteration", each step of which requires the solution of a nonlinear problem using an "inner iteration". Usually Ne...

7 | A hybrid algorithm for optimizing eigenvalues of symmetric definite pencils
- Haeberly, Overton
- 1994

Citation Context: ...(x)), i.e. solutions of det(λB − A) = 0. Let Q be an n by n matrix of eigenvectors, i.e. satisfying AQ = BQ Diag(λ_i), with the normalization condition Q^T BQ = I. The following result is given in [2]. Theorem 1. Assume that the multiplicity of λ_1 = λ_1(x) is known to be t, and let Q_1 be the n by t matrix whose columns are the corresponding t columns of the eigenvector matrix Q. Then a necessary co...
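The eigenvector matrix and B-orthonormal normalization appearing in the theorem setup above can be checked numerically. This sketch uses SciPy's generalized symmetric eigensolver on a small made-up pencil; for a symmetric A and positive definite B, `scipy.linalg.eigh` returns eigenvectors satisfying exactly the stated conditions.

```python
import numpy as np
from scipy.linalg import eigh

# Made-up symmetric definite pencil (A, B) of order 4.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M + M.T                      # symmetric
B = M @ M.T + 4 * np.eye(4)      # symmetric positive definite

lam, Q = eigh(A, B)              # generalized eigenpairs, lam ascending

# A Q = B Q Diag(lambda_i): columns of Q are generalized eigenvectors.
assert np.allclose(A @ Q, B @ Q @ np.diag(lam))
# Normalization condition Q^T B Q = I (B-orthonormal eigenvectors).
assert np.allclose(Q.T @ B @ Q, np.eye(4))
```

With this normalization, the multiplicity-t block Q_1 of the theorem is just the t columns of Q belonging to the largest eigenvalue.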

6 | On the superlinear and quadratic convergence of primal-dual interior point linear programming algorithms
- Zhang, Tapia, et al.
- 1992

Citation Context: ...iables to remain in the positive cone (LP) or positive semidefinite cone (SDP). The convergence rate of the resulting algorithm is very sensitive to the particular scheme that is used in the line search [10,14]. We have found the use of the full Newton step, followed by a shift if necessary, to be very efficient: quadratic convergence was achieved in every test problem. 5. Numerical Results The algorit...
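The "full Newton step followed by a shift" idea mentioned in this context can be sketched in the simpler LP setting. The code below is an assumption-laden illustration, not the paper's exact scheme: it takes one full Newton step on the LP central-path conditions (Ax = b, A^T y + s = c, XSe = μe) and then, if the full step leaves the positive cone, shifts x and s back inside it.

```python
import numpy as np

def newton_step(A, b, c, x, y, s, mu):
    """One full primal-dual Newton step toward the LP central path at mu,
    followed by a shift back into the positive cone if needed."""
    m, n = A.shape
    # KKT system for (dx, dy, ds): rows are primal feasibility,
    # dual feasibility, and linearized complementarity.
    K = np.zeros((2 * n + m, 2 * n + m))
    K[:m, :n] = A                       # A dx            = b - A x
    K[m:m + n, n:n + m] = A.T           # A^T dy + ds     = c - A^T y - s
    K[m:m + n, n + m:] = np.eye(n)
    K[m + n:, :n] = np.diag(s)          # S dx + X ds     = mu e - X S e
    K[m + n:, n + m:] = np.diag(x)
    r = np.concatenate([b - A @ x,
                        c - A.T @ y - s,
                        mu * np.ones(n) - x * s])
    d = np.linalg.solve(K, r)
    dx, dy, ds = d[:n], d[n:n + m], d[n + m:]
    # Full step, then shift any nonpositive components inside the cone.
    eps = 1e-8
    x_new = np.maximum(x + dx, eps)
    s_new = np.maximum(s + ds, eps)
    return x_new, y + dy, s_new

# Tiny demo LP: minimize x1 + x2 subject to x1 + x2 = 1, x >= 0.
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 1.0])
x, y, s = np.array([0.5, 0.5]), np.array([0.0]), np.array([1.0, 1.0])
mu = 0.1
for _ in range(6):
    x, y, s = newton_step(A, b, c, x, y, s, mu)
    mu *= 0.1
print(x, x @ s)   # x stays primal feasible; the duality gap x.s shrinks with mu
```

This is only a toy: the paper works with SDP-like systems for pencils, where the complementarity condition couples primal and dual matrices rather than componentwise products.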

5 | An interior-point method for convex fractional programming
- Freund, Jarre
- 1993

Citation Context: ...ars it has been realized that the interior point methods which have been so successful for LP can be extended to solve eigenvalue optimization problems. For optimizing the eigenvalues of pencils, see [3,4,5]. These methods are based on Huard's method of centers and they consist of an "outer iteration", each step of which requires the solution of a nonlinear problem using an "inner iteration". Usually Ne...

4 | An interior point method for solving linear matrix inequality problems
- Fan, Nekooie

Citation Context: ...wton's method is used to solve this nonlinear problem, in which case each step of the inner iteration requires factoring a dense Hessian matrix whose order is m, the number of unknowns. The method of [6] is a modified method of centers for which the objective values of the outer iteration converge quadratically. Note that this method still requires a nonlinear problem to be solved by an inner iterati...