Results 1–10 of 52
Vector spaces of linearizations for matrix polynomials
 SIAM J. Matrix Anal. Appl.
"... Abstract. The classical approach to investigating polynomial eigenvalue problems is linearization, where the polynomial is converted into a larger matrix pencil with the same eigenvalues. For any polynomial there are infinitely many linearizations with widely varying properties, but in practice th ..."
Abstract

Cited by 104 (23 self)
Abstract. The classical approach to investigating polynomial eigenvalue problems is linearization, where the polynomial is converted into a larger matrix pencil with the same eigenvalues. For any polynomial there are infinitely many linearizations with widely varying properties, but in practice the companion forms are typically used. However, these companion forms are not always entirely satisfactory, and linearizations with special properties may sometimes be required. Given a matrix polynomial P, we develop a systematic approach to generating large classes of linearizations for P. We show how to simply construct two vector spaces of pencils that generalize the companion forms of P, and prove that almost all of these pencils are linearizations for P. Eigenvectors of these pencils are shown to be closely related to those of P. A distinguished subspace is then isolated, and the special properties of these pencils are investigated. These spaces of pencils provide a convenient arena in which to look for structured linearizations of structured polynomials, as well as to try to optimize the conditioning of linearizations.
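As a concrete illustration of the companion forms this abstract generalizes, the sketch below (with hypothetical random coefficients; scipy assumed available) builds the first companion pencil λX + Y for a quadratic P(λ) = λ²A₂ + λA₁ + A₀ and checks that the pencil's eigenvalues make P singular:

```python
import numpy as np
import scipy.linalg as sla

# Hypothetical example: a random quadratic matrix polynomial
# P(lambda) = lambda^2 A2 + lambda A1 + A0 with n x n coefficients.
rng = np.random.default_rng(0)
n = 3
A0, A1, A2 = (rng.standard_normal((n, n)) for _ in range(3))

# First companion form: C1(lambda) = lambda X + Y with
# X = [[A2, 0], [0, I]],  Y = [[A1, A0], [-I, 0]].
# Applied to [lambda*x; x], the top block gives P(lambda) x and the bottom
# block vanishes, so the pencil has the same eigenvalues as P.
I = np.eye(n)
Z = np.zeros((n, n))
X = np.block([[A2, Z], [Z, I]])
Y = np.block([[A1, A0], [-I, Z]])

# Generalized eigenvalues of lambda X + Y: solve -Y v = lambda X v.
lam = sla.eig(-Y, X, right=False)

# Check: each computed eigenvalue makes P(lambda) (numerically) singular.
for l in lam:
    P = l**2 * A2 + l * A1 + A0
    assert np.linalg.svd(P, compute_uv=False)[-1] < 1e-8
```

A degree-m polynomial of size n yields 2n eigenvalues here (mn in general), matching the mn×mn size of the pencil.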
Structured polynomial eigenvalue problems: Good vibrations from good linearizations
 SIAM J. Matrix Anal. Appl.
"... Abstract. Many applications give rise to nonlinear eigenvalue problems with an underlying structured matrix polynomial. In this paper several useful classes of structured polynomial (e.g., palindromic, even, odd) are identified and the relationships between them explored. A special class of lineariz ..."
Abstract

Cited by 73 (24 self)
Abstract. Many applications give rise to nonlinear eigenvalue problems with an underlying structured matrix polynomial. In this paper several useful classes of structured polynomial (e.g., palindromic, even, odd) are identified and the relationships between them explored. A special class of linearizations that reflect the structure of these polynomials, and therefore preserve symmetries in their spectra, is introduced and investigated. We analyze the existence and uniqueness of such linearizations, and show how they may be systematically constructed.
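To see the spectral symmetry such structured linearizations aim to preserve: a T-palindromic quadratic satisfies rev P(λ)ᵀ = P(λ), so its eigenvalues come in reciprocal pairs (λ, 1/λ). The hypothetical example below verifies the pairing numerically, using an unstructured companion linearization purely as a computational device (the structured linearizations of the paper would additionally preserve the palindromic structure itself):

```python
import numpy as np
import scipy.linalg as sla

# Hypothetical T-palindromic quadratic: P(lambda) = lambda^2 Z^T + lambda Q + Z
# with Q = Q^T, so rev P(lambda)^T = P(lambda).  Its spectrum is symmetric
# under lambda -> 1/lambda.
rng = np.random.default_rng(1)
n = 3
Z = rng.standard_normal((n, n))
Q = rng.standard_normal((n, n))
Q = Q + Q.T
A2, A1, A0 = Z.T, Q, Z

# Unstructured first companion linearization (structure is NOT preserved here).
I, O = np.eye(n), np.zeros((n, n))
X = np.block([[A2, O], [O, I]])
Y = np.block([[A1, A0], [-I, O]])
lam = sla.eig(-Y, X, right=False)

# Check the reciprocal pairing: for each eigenvalue, 1/lambda is also one.
for l in lam:
    assert np.min(np.abs(lam - 1.0 / l)) < 1e-6
```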
The conditioning of linearizations of matrix polynomials
 Manchester Institute for Mathematical Sciences, The University of Manchester
, 2005
"... Abstract. The standard way of solving the polynomial eigenvalue problem of degree m in n×n matrices is to “linearize ” to a pencil in mn×mn matrices and solve the generalized eigenvalue problem. For a given polynomial, P, infinitely many linearizations exist and they can have widely varying eigenva ..."
Abstract

Cited by 56 (21 self)
Abstract. The standard way of solving the polynomial eigenvalue problem of degree m in n×n matrices is to “linearize” to a pencil in mn×mn matrices and solve the generalized eigenvalue problem. For a given polynomial P, infinitely many linearizations exist and they can have widely varying eigenvalue condition numbers. We investigate the conditioning of linearizations from a vector space DL(P) of pencils recently identified and studied by Mackey, Mackey, Mehl, and Mehrmann. We look for the best conditioned linearization and compare the conditioning with that of the original polynomial. Two particular pencils are shown always to be almost optimal over linearizations in DL(P) for eigenvalues of modulus greater than or less than 1, respectively, provided that the problem is not too badly scaled and that the pencils are linearizations. Moreover, under this scaling assumption, these pencils are shown to be about as well conditioned as the original polynomial. For quadratic eigenvalue problems that are not too heavily damped, a simple scaling is shown to convert the problem to one that is well scaled. We also analyze the eigenvalue conditioning of the widely used first and second companion linearizations. The conditioning of the first companion linearization relative to that of P is shown to depend on the coefficient matrix norms, the eigenvalue, and the left eigenvectors of the linearization and of P. The companion form is found to be potentially much more
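The "simple scaling" for quadratics mentioned above can be illustrated with the well-known Fan–Lin–Van Dooren scaling: substitute λ = γμ and multiply through by δ so the three coefficient norms are balanced. A sketch with hypothetical, deliberately badly scaled matrices:

```python
import numpy as np

# Scaling for Q(lambda) = lambda^2 A2 + lambda A1 + A0 (Fan, Lin, Van Dooren):
# substitute lambda = gamma * mu, multiply through by delta, choosing gamma and
# delta from the coefficient norms so the scaled norms are all O(1).
rng = np.random.default_rng(2)
n = 4
A2 = 1e-6 * rng.standard_normal((n, n))   # light "mass"
A1 = 1e-2 * rng.standard_normal((n, n))   # modest damping
A0 = 1e+4 * rng.standard_normal((n, n))   # stiff

gamma = np.sqrt(np.linalg.norm(A0, 2) / np.linalg.norm(A2, 2))
delta = 2.0 / (np.linalg.norm(A0, 2) + gamma * np.linalg.norm(A1, 2))

B2 = delta * gamma**2 * A2   # coefficient of mu^2
B1 = delta * gamma * A1      # coefficient of mu
B0 = delta * A0              # constant coefficient

# Scaled spectral norms are balanced; eigenvalues transform as lambda = gamma * mu.
norms = [np.linalg.norm(B, 2) for B in (B2, B1, B0)]
print(norms)
```

By construction the scaled norms are bounded by 2, so the scaled problem is well scaled regardless of the original coefficient magnitudes.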
Backward error of polynomial eigenproblems solved by linearization
 Manchester Institute for Mathematical Sciences, The University of Manchester
, 2006
"... Abstract. The most widely used approach for solving the polynomial eigenvalue problem P(λ)x = ��m i=0 λi � Ai x =0inn × n matrices Ai is to linearize to produce a larger order pencil L(λ) =λX + Y, whose eigensystem is then found by any method for generalized eigenproblems. For a given polynomial P, ..."
Abstract

Cited by 44 (11 self)
Abstract. The most widely used approach for solving the polynomial eigenvalue problem P(λ)x = (∑_{i=0}^{m} λ^i A_i) x = 0 in n × n matrices A_i is to linearize to produce a larger order pencil L(λ) = λX + Y, whose eigensystem is then found by any method for generalized eigenproblems. For a given polynomial P, infinitely many linearizations L exist and approximate eigenpairs of P computed via linearization can have widely varying backward errors. We show that if a certain one-sided factorization relating L to P can be found then a simple formula permits recovery of right eigenvectors of P from those of L, and the backward error of an approximate eigenpair of P can be bounded in terms of the backward error for the corresponding approximate eigenpair of L. A similar factorization has the same implications for left eigenvectors. We use this technique to derive backward error bounds depending only on the norms of the A_i for the companion pencils and for the vector space DL(P) of pencils recently identified by Mackey, Mackey, Mehl, and Mehrmann. In all cases, sufficient conditions are identified for an optimal backward error for P. These results are shown to be entirely consistent with those of Higham, Mackey, and Tisseur on the conditioning of linearizations of P. Other contributions of this work are a block scaling of the companion pencils
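The backward errors discussed above are the standard normwise backward errors of approximate eigenpairs (cf. Tisseur): η(x, λ) = ‖P(λ)x‖ / ((∑_i |λ|^i ‖A_i‖) ‖x‖). A minimal sketch with hypothetical matrices, recovering eigenvectors of P from the companion pencil's eigenvectors:

```python
import numpy as np
import scipy.linalg as sla

# Normwise backward error of an approximate eigenpair (x, lam) of
# P(lambda) = sum_i lambda^i A_i:
#   eta(x, lam) = ||P(lam) x|| / ( (sum_i |lam|^i ||A_i||) ||x|| ).
def backward_error(coeffs, lam, x):
    """coeffs = [A0, A1, ..., Am]."""
    P = sum(lam**i * A for i, A in enumerate(coeffs))
    denom = sum(abs(lam)**i * np.linalg.norm(A, 2) for i, A in enumerate(coeffs))
    return np.linalg.norm(P @ x) / (denom * np.linalg.norm(x))

# Hypothetical quadratic; eigenpairs computed via the first companion pencil.
rng = np.random.default_rng(3)
n = 3
A0, A1, A2 = (rng.standard_normal((n, n)) for _ in range(3))
I, O = np.eye(n), np.zeros((n, n))
X = np.block([[A2, O], [O, I]])
Y = np.block([[A1, A0], [-I, O]])
lam, V = sla.eig(-Y, X)

# Pencil eigenvectors approximate [lam*x; x], so right eigenvectors of P
# are recovered from the lower n entries.
etas = [backward_error([A0, A1, A2], lam[k], V[n:, k]) for k in range(2 * n)]
print(max(etas))   # small for this well-scaled problem
```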
LINEARIZATION OF MATRIX POLYNOMIALS EXPRESSED IN POLYNOMIAL BASES
"... Companion matrices of matrix polynomials L(λ) (with possibly singular leading coefficient) are a familiar tool in matrix theory and numerical practice leading to socalled “linearizations ” λB − A of the polynomials. Matrix polynomials as approximations to more general matrix functions lead to the s ..."
Abstract

Cited by 34 (2 self)
Companion matrices of matrix polynomials L(λ) (with possibly singular leading coefficient) are a familiar tool in matrix theory and numerical practice, leading to so-called “linearizations” λB − A of the polynomials. Matrix polynomials as approximations to more general matrix functions lead to the study of matrix polynomials represented in a variety of classical systems of polynomials, including orthogonal systems and Lagrange polynomials, for example. For several such representations, it is shown how to construct (strong) linearizations via analogous companion matrix pencils. In case L(λ) has Hermitian or alternatively complex symmetric coefficients, the determination of linearizations λB − A with A and B Hermitian or complex symmetric is also discussed.
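A scalar analogue of such basis-adapted companion pencils is the Chebyshev "colleague" matrix, which NumPy exposes directly; the matrix-polynomial constructions discussed here generalize this idea to other bases:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# For a polynomial expressed in the Chebyshev basis, the colleague matrix is
# built directly from the Chebyshev coefficients, with no conversion to the
# monomial basis; its eigenvalues are the polynomial's roots.
coef = C.chebfromroots([-0.5, 0.2, 0.9])   # Chebyshev series with known roots
M = C.chebcompanion(coef)                  # colleague (companion-like) matrix
roots = np.sort(np.linalg.eigvals(M).real)
print(roots)   # approximately [-0.5, 0.2, 0.9]
```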
Linearizations of singular matrix polynomials and the recovery of minimal indices
"... Abstract. A standard way of dealing with a regular matrix polynomial P (λ) is to convert it into an equivalent matrix pencil – a process known as linearization. Two vector spaces of pencils L1(P) and L2(P) that generalize the first and second companion forms have recently been introduced by Mackey, ..."
Abstract

Cited by 25 (5 self)
Abstract. A standard way of dealing with a regular matrix polynomial P(λ) is to convert it into an equivalent matrix pencil – a process known as linearization. Two vector spaces of pencils L1(P) and L2(P) that generalize the first and second companion forms have recently been introduced by Mackey, Mackey, Mehl and Mehrmann. Almost all of these pencils are linearizations for P(λ) when P is regular. The goal of this work is to show that most of the pencils in L1(P) and L2(P) are still linearizations when P(λ) is a singular square matrix polynomial, and that these linearizations can be used to obtain the complete eigenstructure of P(λ), comprising not only the finite and infinite eigenvalues but also, for singular polynomials, the left and right minimal indices and minimal bases. We show explicitly how to recover the minimal indices and bases of the polynomial P(λ) from the minimal indices and bases of linearizations in L1(P) and L2(P). As a consequence of the recovery formulae for minimal indices, we prove that the vector space DL(P) = L1(P) ∩ L2(P) will never contain any linearization for a square singular polynomial P(λ). Finally, the results are extended to other linearizations of singular polynomials defined in terms of more general polynomial bases.
Definite matrix polynomials and their linearization by definite pencils
 Manchester Institute for Mathematical Sciences, The University of Manchester
, 2008
"... Abstract. Hyperbolic matrix polynomials are an important class of Hermitian matrix polynomials that contain overdamped quadratics as a special case. They share with definite pencils the spectral property that their eigenvalues are real and semisimple. We extend the definition of hyperbolic matrix po ..."
Abstract

Cited by 16 (8 self)
Abstract. Hyperbolic matrix polynomials are an important class of Hermitian matrix polynomials that contain overdamped quadratics as a special case. They share with definite pencils the spectral property that their eigenvalues are real and semisimple. We extend the definition of hyperbolic matrix polynomial in a way that relaxes the requirement of definiteness of the leading coefficient matrix, yielding what we call definite polynomials. We show that this class of polynomials has an elegant characterization in terms of definiteness intervals on the extended real line, and that it includes definite pencils as a special case. A fundamental question is whether a definite matrix polynomial P can be linearized in a structurepreserving way. We show that the answer to this question is affirmative: P is definite if and only if it has a definite linearization in H(P), a certain vector space of Hermitian pencils; and for definite P we give a complete characterization of all the linearizations in H(P) that are definite. For the important special case of quadratics, we show how a definite quadratic polynomial can be transformed into a definite linearization with a positive definite leading coefficient matrix—a form that is particularly attractive numerically.
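Overdamped quadratics, the motivating special case above, indeed have an all-real spectrum. A minimal numerical check on a hypothetical example, using the sufficient overdamping condition λ_min(C)² > 4 λ_max(M) λ_max(K):

```python
import numpy as np
import scipy.linalg as sla

# Hypothetical overdamped quadratic Q(lambda) = lambda^2 M + lambda C + K:
# M, C, K symmetric positive definite with lambda_min(C)^2 > 4 lambda_max(M) lambda_max(K).
# Such quadratics are hyperbolic, so all 2n eigenvalues are real (and negative).
rng = np.random.default_rng(4)
n = 4
M = np.eye(n)
B = rng.standard_normal((n, n))
K = B @ B.T / np.linalg.norm(B @ B.T, 2)   # SPD with spectral norm 1
C = 10.0 * np.eye(n)                       # heavy damping: 10^2 > 4 * 1 * 1

# Companion linearization, used here only to compute the spectrum.
I, O = np.eye(n), np.zeros((n, n))
X = np.block([[M, O], [O, I]])
Y = np.block([[C, K], [-I, O]])
lam = sla.eig(-Y, X, right=False)

print(np.max(np.abs(lam.imag)))   # essentially zero: the spectrum is real
assert np.all(lam.real < 0)       # and negative, as for overdamped systems
```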
Trimmed linearizations for structured matrix polynomials
, 2008
"... Dedicated to Richard S. Varga on the occasion of his 80th birthday. We discuss the eigenvalue problem for general and structured matrix polynomials which may be singular and may have eigenvalues at infinity. We derive condensed forms that allow (partial) deflation of the infinite eigenvalue and sing ..."
Abstract

Cited by 15 (3 self)
Dedicated to Richard S. Varga on the occasion of his 80th birthday. We discuss the eigenvalue problem for general and structured matrix polynomials, which may be singular and may have eigenvalues at infinity. We derive condensed forms that allow (partial) deflation of the infinite eigenvalue and of the singular structure of the matrix polynomial. The remaining reduced-order staircase form leads to new types of linearizations which determine the finite eigenvalues and corresponding eigenvectors. The new linearizations also simplify the construction of structure-preserving linearizations.
Solving rational eigenvalue problems via linearization
, 2008
"... Abstract. Rational eigenvalue problem is an emerging class of nonlinear eigenvalue problems arising from a variety of physical applications. In this paper, we propose a linearizationbased method to solve the rational eigenvalue problem. The proposed method converts the rational eigenvalue problem i ..."
Abstract

Cited by 14 (0 self)
Abstract. The rational eigenvalue problem is an emerging class of nonlinear eigenvalue problems arising from a variety of physical applications. In this paper, we propose a linearization-based method to solve the rational eigenvalue problem. The proposed method converts the rational eigenvalue problem into a well-studied linear eigenvalue problem while exploiting and preserving the structure and properties of the original rational eigenvalue problem. For example, the low-rank property leads to a trimmed linearization. We show that solving a class of rational eigenvalue problems is just as convenient and efficient as solving linear eigenvalue problems.

Key words. Rational eigenvalue problem, linearization, nonlinear eigenvalue problem

AMS subject classifications. 65F15, 65F50, 15A18
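The idea of exploiting low rank in the rational terms can be sketched on a rank-one example. This is an illustrative construction under simplifying assumptions, not necessarily the authors' exact trimmed linearization: introducing one auxiliary scalar per rank-one rational term turns the rational problem into a linear pencil of size n+1 instead of 2n:

```python
import numpy as np
import scipy.linalg as sla

# Rational eigenvalue problem with a rank-one rational term:
#     R(lambda) x = ( A - lambda I + (1/(sigma - lambda)) u v^T ) x = 0.
# Introducing the auxiliary scalar y = v^T x / (sigma - lambda) gives the
# equivalent LINEAR pencil of size n+1:
#     (lambda X + Y) [x; y] = 0,
#     X = [[-I, 0], [0, 1]],  Y = [[A, u], [v^T, -sigma]].
rng = np.random.default_rng(5)
n = 4
A = rng.standard_normal((n, n))
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))
sigma = 2.0

X = np.block([[-np.eye(n), np.zeros((n, 1))],
              [np.zeros((1, n)), np.ones((1, 1))]])
Y = np.block([[A, u],
              [v.T, -sigma * np.ones((1, 1))]])
lam = sla.eig(-Y, X, right=False)

# Every finite pencil eigenvalue away from sigma makes R(lambda) singular.
for l in lam:
    if np.isfinite(l) and abs(l - sigma) > 1e-6:
        R = A - l * np.eye(n) + (1.0 / (sigma - l)) * (u @ v.T)
        assert np.linalg.svd(R, compute_uv=False)[-1] < 1e-6
```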