Structured polynomial eigenvalue problems: Good vibrations from good linearizations
SIAM J. Matrix Anal. Appl.
Abstract

Cited by 37 (14 self)
Many applications give rise to nonlinear eigenvalue problems with an underlying structured matrix polynomial. In this paper several useful classes of structured matrix polynomials (e.g., palindromic, even, odd) are identified and the relationships between them are explored. A special class of linearizations that reflect the structure of these polynomials, and therefore preserve symmetries in their spectra, is introduced and investigated. We analyze the existence and uniqueness of such linearizations, and show how they may be systematically constructed.
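The spectral symmetry mentioned in this abstract can be made concrete. A T-palindromic quadratic P(λ) = λ²A₂ + λA₁ + A₀ satisfies Aᵢ = A₂₋ᵢᵀ, and its eigenvalues occur in pairs (λ, 1/λ). The following minimal numpy sketch (not from the paper; the function names and the companion-pencil choice are mine, and it assumes a nonsingular leading coefficient) checks the palindromic structure and exhibits the pairing numerically:

```python
import numpy as np

def is_t_palindromic(coeffs, tol=1e-12):
    """Check A_i == A_{m-i}^T for P(lam) = sum_i lam^i A_i."""
    m = len(coeffs) - 1
    return all(np.allclose(coeffs[i], coeffs[m - i].T, atol=tol)
               for i in range(m + 1))

def quad_eigvals(A0, A1, A2):
    """Eigenvalues of P(lam) = lam^2 A2 + lam A1 + A0 via a companion pencil."""
    n = A0.shape[0]
    I, Z = np.eye(n), np.zeros((n, n))
    X = np.block([[A2, Z], [Z, I]])   # pencil lam*X + Y, eigenvector z = [lam*x; x]
    Y = np.block([[A1, A0], [-I, Z]])
    # (lam*X + Y) z = 0  <=>  lam z = X^{-1}(-Y) z   (assumes A2 nonsingular)
    return np.linalg.eigvals(np.linalg.solve(X, -Y))
```

For a random T-palindromic quadratic (A₀ = B, A₁ = A₁ᵀ, A₂ = Bᵀ), each computed eigenvalue's reciprocal reappears in the spectrum, illustrating the symmetry the structured linearizations of the paper are designed to preserve.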
Symmetric linearizations for matrix polynomials
SIAM J. Matrix Anal. Appl., 2006
Abstract

Cited by 31 (12 self)
A standard way of treating the polynomial eigenvalue problem P(λ)x = 0 is to convert it into an equivalent matrix pencil—a process known as linearization. Two vector spaces of pencils L1(P) and L2(P), and their intersection DL(P), have recently been defined and studied by Mackey, Mackey, Mehl, and Mehrmann. The aim of our work is to gain new insight into these spaces and the extent to which their constituent pencils inherit structure from P. For arbitrary polynomials we show that every pencil in DL(P) is block symmetric and we obtain a convenient basis for DL(P) built from block Hankel matrices. This basis is then exploited to prove that the first deg(P) pencils in a sequence constructed by Lancaster in the 1960s generate DL(P). When P is symmetric, we show that the symmetric pencils in L1(P) comprise DL(P), while for Hermitian P the Hermitian pencils in L1(P) form a proper subset of DL(P) that we explicitly characterize. Almost all pencils in each of these subsets are shown to be linearizations. In addition to obtaining new results, this work provides a self-contained treatment of some of the key properties of DL(P) together with some new, more concise proofs.
Backward error of polynomial eigenproblems solved by linearization
Manchester Institute for Mathematical Sciences, The University of Manchester, 2006
Abstract

Cited by 25 (7 self)
The most widely used approach for solving the polynomial eigenvalue problem P(λ)x = (∑_{i=0}^{m} λ^i A_i)x = 0 with n × n matrices A_i is to linearize to produce a larger order pencil L(λ) = λX + Y, whose eigensystem is then found by any method for generalized eigenproblems. For a given polynomial P, infinitely many linearizations L exist and approximate eigenpairs of P computed via linearization can have widely varying backward errors. We show that if a certain one-sided factorization relating L to P can be found then a simple formula permits recovery of right eigenvectors of P from those of L, and the backward error of an approximate eigenpair of P can be bounded in terms of the backward error for the corresponding approximate eigenpair of L. A similar factorization has the same implications for left eigenvectors. We use this technique to derive backward error bounds depending only on the norms of the A_i for the companion pencils and for the vector space DL(P) of pencils recently identified by Mackey, Mackey, Mehl, and Mehrmann. In all cases, sufficient conditions are identified for an optimal backward error for P. These results are shown to be entirely consistent with those of Higham, Mackey, and Tisseur on the conditioning of linearizations of P. Other contributions of this work are a block scaling of the companion pencils …
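The backward error studied in this paper has a standard explicit formula (due to Tisseur) that is easy to evaluate: for an approximate eigenpair (λ, x) of P(λ) = ∑ᵢ λⁱAᵢ, the normwise backward error is ‖P(λ)x‖₂ / ((∑ᵢ |λ|ⁱ ‖Aᵢ‖₂) ‖x‖₂). A minimal numpy sketch (the function name is mine):

```python
import numpy as np

def backward_error(coeffs, lam, x):
    """Normwise backward error of an approximate eigenpair (lam, x) of
    P(lam) = sum_i lam^i A_i, using Tisseur's formula with 2-norms.
    coeffs = [A_0, A_1, ..., A_m]."""
    residual = sum(lam**i * A for i, A in enumerate(coeffs)) @ x
    alpha = sum(abs(lam)**i * np.linalg.norm(A, 2)
                for i, A in enumerate(coeffs))
    return np.linalg.norm(residual) / (alpha * np.linalg.norm(x))
```

An exact eigenpair gives a backward error of zero; comparing this quantity for eigenpairs recovered from different linearizations is precisely how one observes the "widely varying backward errors" the abstract refers to.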
Detecting and solving hyperbolic quadratic eigenvalue problems
, 2007
Scaling, sensitivity and stability in the numerical solution of quadratic eigenvalue problems
Internat. J. Numer. Methods Eng., 2006
Abstract

Cited by 15 (7 self)
The most common way of solving the quadratic eigenvalue problem (QEP) (λ²M + λD + K)x = 0 is to convert it into a linear problem (λX + Y)z = 0 of twice the dimension and solve the linear problem by the QZ algorithm or a Krylov method. In doing so, it is important to understand the influence of the linearization process on the accuracy and stability of the computed solution. We discuss these issues for three particular linearizations: the standard companion linearization and two linearizations that preserve symmetry in the problem. For illustration we employ a model QEP describing the motion of a beam simply supported at both ends and damped at the midpoint. We show that the above linearizations lead to poor numerical results for the beam problem, but that a two-parameter scaling proposed by Fan, Lin and Van Dooren cures the instabilities. We also show that half of the eigenvalues of the beam QEP are pure imaginary and are eigenvalues of the undamped problem. Our analysis makes use of recently developed theory explaining the sensitivity and stability of linearizations, the main conclusions of which are summarized. As well as arguing that scaling should routinely be used, we give guidance on how to choose a linearization and illustrate the practical value of condition numbers and backward errors.

Key words: quadratic eigenvalue problem, sensitivity, condition number, backward error, stability, …
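One common statement of the two-parameter scaling referred to here: with γ = √(‖K‖₂/‖M‖₂) and δ = 2/(‖K‖₂ + γ‖D‖₂), substitute λ = γμ and solve the scaled QEP μ²(γ²δM) + μ(γδD) + δK before undoing the substitution. A hedged sketch of this pipeline using a first companion pencil (the function name is mine; scipy is assumed available):

```python
import numpy as np
from scipy.linalg import eig

def solve_qep_scaled(M, D, K):
    """Solve (lam^2 M + lam D + K) x = 0 via companion linearization,
    after the Fan-Lin-Van Dooren two-parameter scaling lam = gamma * mu."""
    n = M.shape[0]
    nm, nd, nk = (np.linalg.norm(A, 2) for A in (M, D, K))
    gamma = np.sqrt(nk / nm)
    delta = 2.0 / (nk + gamma * nd)
    Ms, Ds, Ks = gamma**2 * delta * M, gamma * delta * D, delta * K
    I, Z = np.eye(n), np.zeros((n, n))
    # First companion pencil: A z = mu * B z with z = [x; mu*x]
    A = np.block([[Z, I], [-Ks, -Ds]])
    B = np.block([[I, Z], [Z, Ms]])
    mu, V = eig(A, B)
    return gamma * mu, V[:n, :]   # undo the scaling; x is the leading block of z
```

The scaling leaves the eigenvectors unchanged and only rescales the eigenvalues, which is why recovering λ = γμ at the end suffices; its payoff, as the paper argues, is in the backward error of the computed eigenpairs rather than in the eigenvalues themselves.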
Solving rational eigenvalue problems via linearization
, 2008
Abstract

Cited by 9 (0 self)
The rational eigenvalue problem is an emerging class of nonlinear eigenvalue problems arising from a variety of physical applications. In this paper, we propose a linearization-based method to solve the rational eigenvalue problem. The proposed method converts the rational eigenvalue problem into a well-studied linear eigenvalue problem, and meanwhile, exploits and preserves the structure and properties of the original rational eigenvalue problem. For example, the low-rank property leads to a trimmed linearization. We show that solving a class of rational eigenvalue problems is just as convenient and efficient as solving linear eigenvalue problems.

Key words: rational eigenvalue problem, linearization, nonlinear eigenvalue problem
AMS subject classifications: 65F15, 65F50, 15A18
Definite matrix polynomials and their linearization by definite pencils
Manchester Institute for Mathematical Sciences, The University of Manchester, 2008
Abstract

Cited by 7 (7 self)
Hyperbolic matrix polynomials are an important class of Hermitian matrix polynomials that contain overdamped quadratics as a special case. They share with definite pencils the spectral property that their eigenvalues are real and semisimple. We extend the definition of hyperbolic matrix polynomial in a way that relaxes the requirement of definiteness of the leading coefficient matrix, yielding what we call definite polynomials. We show that this class of polynomials has an elegant characterization in terms of definiteness intervals on the extended real line, and that it includes definite pencils as a special case. A fundamental question is whether a definite matrix polynomial P can be linearized in a structure-preserving way. We show that the answer to this question is affirmative: P is definite if and only if it has a definite linearization in H(P), a certain vector space of Hermitian pencils; and for definite P we give a complete characterization of all the linearizations in H(P) that are definite. For the important special case of quadratics, we show how a definite quadratic polynomial can be transformed into a definite linearization with a positive definite leading coefficient matrix—a form that is particularly attractive numerically.
Structured Hölder condition numbers for multiple eigenvalues
, 2006
Abstract

Cited by 7 (3 self)
The sensitivity of a multiple eigenvalue of a matrix under perturbations can be measured by its Hölder condition number. Various extensions of this concept are considered. A meaningful notion of structured Hölder condition numbers is introduced and it is shown that many existing results on structured condition numbers for simple eigenvalues carry over to multiple eigenvalues. The structures investigated in more detail include real, Toeplitz, Hankel, symmetric, skew-symmetric, Hamiltonian, and skew-Hamiltonian matrices. Furthermore, unstructured and structured Hölder condition numbers for multiple eigenvalues of matrix pencils are introduced. Particular attention is given to symmetric/skew-symmetric, Hermitian and palindromic pencils. It is also shown how matrix polynomial eigenvalue problems can be covered within this framework.