Results 1–10 of 60
ARPACK Users Guide: Solution of Large Scale Eigenvalue Problems by Implicitly Restarted Arnoldi Methods.
, 1997
Abstract

Cited by 136 (14 self)
This document is intended to provide a cursory overview of the Implicitly Restarted Arnoldi/Lanczos Method that this software is based upon. The goal is to provide some understanding of the underlying algorithm, expected behavior, additional references, and capabilities as well as limitations of the software.
Krylov Projection Methods For Model Reduction
, 1997
Abstract

Cited by 119 (3 self)
This dissertation focuses on efficiently forming reduced-order models for large, linear dynamic systems. Projections onto unions of Krylov subspaces lead to a class of reduced-order models known as rational interpolants. The cornerstone of this dissertation is a collection of theory relating Krylov projection to rational interpolation. Based on this theoretical framework, three algorithms for model reduction are proposed. The first algorithm, dual rational Arnoldi, is a numerically reliable approach involving orthogonal projection matrices. The second, rational Lanczos, is an efficient generalization of existing Lanczos-based methods. The third, rational power Krylov, avoids orthogonalization and is suited for parallel or approximate computations. The performance of the three algorithms is compared via a combination of theory and examples. Independent of the precise algorithm, a host of supporting tools are also developed to form a complete model-reduction package. Techniques for choosing the matching frequencies, estimating the modeling error, ensuring the model's stability, treating multiple-input multiple-output systems, implementing parallelism, and avoiding a need for exact factors of large matrix pencils are all examined to various degrees.
A Shifted Block Lanczos Algorithm For Solving Sparse Symmetric Generalized Eigenproblems
, 1994
Abstract

Cited by 86 (7 self)
An "industrial strength" algorithm for solving sparse symmetric generalized eigenproblems is described. The algorithm has its foundations in known techniques for solving sparse symmetric eigenproblems, notably the spectral transformation of Ericsson and Ruhe and the block Lanczos algorithm. However, the combination of these two techniques is not trivial; there are many pitfalls awaiting the unwary implementor. The focus of this paper is on identifying those pitfalls and avoiding them, leading to a "bombproof" algorithm that can live as a black-box eigensolver inside a large applications code. The code that results comprises a robust shift-selection strategy and a block Lanczos algorithm that is a novel combination of new techniques and extensions of old techniques.
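The spectral transformation mentioned in the abstract maps eigenvalues of K x = λ M x near a chosen shift σ to the extremes of the spectrum of (K − σM)⁻¹M, where Lanczos converges fastest. A minimal dense sketch of that mapping (the function name and the dense solve are assumptions for illustration; a production code like the one described factors K − σM sparsely and runs block Lanczos on the transformed operator):

```python
import numpy as np

def spectral_transform_eigs(K, M, sigma, k):
    """Eigenvalues of the pencil K x = lam M x closest to the shift sigma,
    recovered via the Ericsson-Ruhe spectral transformation: eigenvalues
    nu of C = (K - sigma M)^{-1} M satisfy nu = 1/(lam - sigma), so
    lam near sigma become the largest-magnitude nu."""
    C = np.linalg.solve(K - sigma * M, M)     # dense stand-in for a sparse factorization
    nu = np.linalg.eigvals(C)                 # transformed spectrum
    nu = nu[np.argsort(-np.abs(nu))][:k]      # k largest |nu| <-> lam nearest sigma
    return np.sort((sigma + 1.0 / nu).real)   # invert the transformation
```

For instance, with K = diag(1,…,5), M = I, and σ = 3.2, the two recovered eigenvalues are 3 and 4, the ones nearest the shift.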
An Implicitly Restarted Lanczos Method for Large Symmetric. . .
 ETNA
, 1994
Abstract

Cited by 54 (13 self)
The Lanczos process is a well-known technique for computing a few, say k, eigenvalues and associated eigenvectors of a large symmetric n × n matrix. However, loss of orthogonality of the computed Krylov subspace basis can reduce the accuracy of the computed approximate eigenvalues. In the implicitly restarted Lanczos method studied in the present paper, this problem is addressed by fixing the number of steps in the Lanczos process at a prescribed value, k + p, where p typically is not much larger, and may be smaller, than k. Orthogonality of the k + p basis vectors of the Krylov subspace is secured by reorthogonalizing these vectors when necessary. The implicitly restarted Lanczos method exploits the fact that the residual vector obtained by the Lanczos process is a function of the initial Lanczos vector. The method updates the initial Lanczos vector through an iterative scheme. The purpose of the iterative scheme is to determine an initial vector such that the associated residual vector is tiny…
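For reference, the process being restarted can be sketched as follows. This minimal symmetric Lanczos iteration uses blanket full reorthogonalization rather than the paper's reorthogonalize-when-necessary scheme, and omits the implicit-restart update of the starting vector; the function name is an assumption:

```python
import numpy as np

def lanczos(A, v0, m):
    """m steps of the symmetric Lanczos process with full
    reorthogonalization.  Returns an orthonormal basis V of the Krylov
    subspace and the tridiagonal projection T; the eigenvalues of T
    (Ritz values) approximate extreme eigenvalues of A."""
    n = len(v0)
    V = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(m)
    v = v0 / np.linalg.norm(v0)
    for j in range(m):
        V[:, j] = v
        w = A @ v
        alpha[j] = v @ w
        w -= V[:, :j + 1] @ (V[:, :j + 1].T @ w)   # full reorthogonalization
        beta[j] = np.linalg.norm(w)
        if beta[j] < 1e-12:                        # invariant subspace found
            m = j + 1
            break
        v = w / beta[j]
    T = np.diag(alpha[:m]) + np.diag(beta[:m - 1], 1) + np.diag(beta[:m - 1], -1)
    return V[:, :m], T
```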
A spectral algorithm for seriation and the consecutive ones problem
 SIAM Journal on Computing
, 1998
Abstract

Cited by 46 (0 self)
In applications ranging from DNA sequencing through archeological dating to sparse matrix reordering, a recurrent problem is the sequencing of elements in such a way that highly correlated pairs of elements are near each other. That is, given a correlation function f reflecting the desire for each pair of elements to be near each other, find all permutations π with the property that if π(i) < π(j) < π(k) then f(i, j) ≥ f(i, k) and f(j, k) ≥ f(i, k). This seriation problem is a generalization of the well-studied consecutive ones problem. We present a spectral algorithm for this problem that has a number of interesting features. Whereas most previous applications of spectral techniques provide only bounds or heuristics, our result is an algorithm that correctly solves a nontrivial combinatorial problem. In addition, spectral methods are being successfully applied as heuristics to a variety of sequencing problems, and our result helps explain and justify these applications.
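The spectral approach described sorts elements by the Fiedler vector of the Laplacian of the correlation matrix; for correlation matrices with the structure above, that vector is monotone along a correct ordering. A short sketch under those assumptions (the function name and dense eigensolve are illustrative choices; the paper itself establishes when this recovers a valid seriation):

```python
import numpy as np

def spectral_seriation(F):
    """Order elements so that highly correlated pairs land near each
    other: form the Laplacian L = D - F of the symmetric correlation
    matrix F and sort indices by the Fiedler vector, the eigenvector
    belonging to the second-smallest eigenvalue of L."""
    L = np.diag(F.sum(axis=1)) - F
    eigenvalues, eigenvectors = np.linalg.eigh(L)   # ascending eigenvalues
    fiedler = eigenvectors[:, 1]
    return np.argsort(fiedler)
```

Since the sign of an eigenvector is arbitrary, either the returned order or its reverse is the valid seriation.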
Choosing regularization parameters in iterative methods for ill-posed problems
 SIAM J. MATRIX ANAL. APPL
, 2001
Abstract

Cited by 32 (6 self)
Numerical solution of ill-posed problems is often accomplished by discretization (projection onto a finite-dimensional subspace) followed by regularization. If the discrete problem has high dimension, though, typically we compute an approximate solution by projecting the discrete problem onto an even smaller-dimensional space, via iterative methods based on Krylov subspaces. In this work we present a common framework for efficient algorithms that regularize after this second projection rather than before it. We show that determining regularization parameters based on the final projected problem rather than on the original discretization has firmer justification and often involves less computational expense. We prove some results on the approximate equivalence of this approach to other forms of regularization, and we present numerical examples.
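The regularize-after-the-second-projection idea can be sketched directly: build a basis for a small Krylov subspace of the discrete problem (as iterative methods such as LSQR implicitly do), then apply Tikhonov regularization only to the projected least-squares problem. Everything below is a simplified assumption: the explicit power-basis construction stands in for the bidiagonalization an iterative method produces, the function name is invented, and the parameter lam is caller-supplied rather than chosen adaptively as in the paper:

```python
import numpy as np

def project_then_regularize(A, b, m, lam):
    """Project min ||Ax - b|| onto the m-dimensional Krylov subspace
    span{A^T b, (A^T A) A^T b, ...}, then solve the small Tikhonov
    problem min ||A V y - b||^2 + lam^2 ||y||^2 and map back."""
    K = np.empty((A.shape[1], m))
    K[:, 0] = A.T @ b
    for j in range(1, m):
        K[:, j] = A.T @ (A @ K[:, j - 1])
    V, _ = np.linalg.qr(K)                    # orthonormal Krylov basis
    W = A @ V                                 # projected operator
    # regularize the *projected* normal equations only
    y = np.linalg.solve(W.T @ W + lam**2 * np.eye(m), W.T @ b)
    return V @ y
```

With lam = 0 and m equal to the full dimension, this reduces to the ordinary least-squares solution.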
Thick-Restart Lanczos Method for Symmetric Eigenvalue Problems
 SIAM J. MATRIX ANAL. APPL
, 1998
Abstract

Cited by 23 (3 self)
For real symmetric eigenvalue problems, there are a number of algorithms that are mathematically equivalent, for example, the Lanczos algorithm, the Arnoldi method, and the unpreconditioned Davidson method. The Lanczos algorithm is often preferred because it uses significantly fewer arithmetic operations per iteration. To limit the maximum memory usage, these algorithms are often restarted. In recent years, a number of effective restarting schemes have been developed for the Arnoldi method and the Davidson method. This paper describes a simple restarting scheme for the Lanczos algorithm. This restarted Lanczos algorithm uses as many arithmetic operations as the original algorithm. Theoretically, this restarted Lanczos method is equivalent to the implicitly restarted Arnoldi method and the thick-restart Davidson method. Because it uses fewer arithmetic operations than the others, it is an attractive alternative for solving symmetric eigenvalue problems.
Efficient Convex Optimization For Engineering Design
, 1994
Abstract

Cited by 21 (13 self)
Many problems in engineering analysis and design can be cast as convex optimization problems, often nonlinear and nondifferentiable. We give a high-level description of recently developed interior-point methods for convex optimization, explain how problem structure can be exploited in these algorithms, and illustrate the general scheme with numerical experiments. To give a rough idea of the efficiencies obtained, we are able to solve convex optimization problems with over 1000 variables and 10000 constraints in around 10 minutes on a workstation. Keywords. Optimization, numerical methods, linear programming, optimal control, robust control, convex programming, interior-point methods, FIR filter design, conjugate gradients. 1. INTRODUCTION. Many problems in engineering analysis and design can be cast as convex optimization problems, i.e., minimize f_0(x) subject to f_i(x) ≤ 0, i = 1, …, L, where the functions f_i are convex. It is widely known that such problems have desirable properties…
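The interior-point scheme alluded to can be illustrated with a tiny log-barrier method for the stated problem class, min f_0(x) s.t. f_i(x) ≤ 0, here in one variable with plain Newton centering. The function name, the fixed step-halving damping, and the stopping rule m/t ≤ tol are simplifying assumptions; real solvers use line searches, self-concordance theory, and the structure exploitation the abstract emphasizes:

```python
def barrier_minimize(f0, g0, h0, cons, x, t=1.0, mu=10.0, tol=1e-9):
    """Log-barrier interior-point sketch for a scalar convex problem
    min f0(x) s.t. fi(x) <= 0.  cons is a list of (fi, gi, hi) triples
    (constraint value, first and second derivatives); x must start
    strictly feasible.  The duality gap is bounded by m/t."""
    m = len(cons)
    while m / t > tol:
        for _ in range(100):                  # Newton centering on t*f0 + barrier
            grad = t * g0(x) + sum(-gi(x) / fi(x) for fi, gi, hi in cons)
            hess = t * h0(x) + sum((gi(x) / fi(x)) ** 2 - hi(x) / fi(x)
                                   for fi, gi, hi in cons)
            step = -grad / hess
            while any(fi(x + step) >= 0 for fi, gi, hi in cons):
                step *= 0.5                   # damp to stay strictly feasible
            x += step
        t *= mu                               # tighten the barrier
    return x
```

As a toy instance, minimizing x² subject to x ≥ 1 (i.e., f_1(x) = 1 − x ≤ 0) from the feasible start x = 2 drives the iterate to the constrained optimum x = 1.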
Implicitly restarted Arnoldi/Lanczos Methods for Large Scale Eigenvalue Calculations
, 1996
Abstract

Cited by 20 (3 self)
Eigenvalues and eigenfunctions of linear operators are important to many areas of applied mathematics. The ability to approximate these quantities numerically is becoming increasingly important in a wide variety of applications. This increasing demand has fueled interest in the development of new methods and software for the numerical solution of large-scale algebraic eigenvalue problems. In turn, the existence of these new methods and software, along with the dramatically increased computational capabilities now available, has enabled the solution of problems that would not even have been posed five or ten years ago. Until very recently, software for large-scale nonsymmetric problems was virtually nonexistent. Fortunately, the situation is improving rapidly. The purpose of this article is to provide an overview of the numerical solution of large-scale algebraic eigenvalue problems. The focus will be on a class of methods called Krylov subspace projection methods. The well-known Lanczos method is the premier member of this class. The Arnoldi method generalizes the Lanczos method to the nonsymmetric case. A recently developed variant of the Arnoldi/Lanczos scheme called the Implicitly Restarted Arnoldi Method is presented here in some depth. This method is highlighted because of its suitability as a basis for software development.
Krylov Subspace Estimation
, 2000
Abstract

Cited by 18 (3 self)
Computing the linear least-squares estimate of a high-dimensional random quantity given noisy data requires solving a large system of linear equations. In many situations, one can solve this system efficiently using a Krylov subspace method, such as the conjugate gradient (CG) algorithm. Computing the estimation error variances is a more intricate task. It is difficult because the error variances are the diagonal elements of a matrix expression involving the inverse of a given matrix. This paper presents a method for using the conjugate search directions generated by the CG algorithm to obtain a convergent approximation to the estimation error variances. The algorithm for computing the error variances falls out naturally from a new estimation-theoretic interpretation of the CG algorithm. This paper discusses this interpretation and convergence issues and presents numerical examples. The examples include a 10^5-dimensional estimation problem from oceanography. Key words. Krylov sub…
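For reference, the CG algorithm the abstract builds on is the textbook Krylov method for a symmetric positive definite system A x = b; the paper's contribution, approximating error variances from the conjugate search directions p, is not reproduced in this sketch:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Textbook conjugate gradient for an SPD system A x = b.  Each
    iteration generates a new A-conjugate search direction p; these
    are the directions the paper reuses for error-variance estimates."""
    n = len(b)
    x = np.zeros(n)
    r = b.copy()                      # residual b - A x (x starts at 0)
    p = r.copy()                      # first conjugate search direction
    rs = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)         # optimal step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p     # next A-conjugate direction
        rs = rs_new
    return x
```

In exact arithmetic CG terminates in at most n iterations, which is why the `max_iter or n` default suffices for small dense tests.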