Results 1–10 of 15
Efficient algorithms for solution of regularized total least squares
 Journal of Geodesy and Geodynamics
Cited by 14 (2 self)
Abstract. Error-contaminated systems Ax ≈ b, for which A is ill-conditioned, are considered. Such systems may be solved using Tikhonov-like regularized total least squares (RTLS) methods. Golub, Hansen, and O’Leary [SIAM J. Matrix Anal. Appl., 21 (1999), pp. 185–194] presented a parameter-dependent direct algorithm for the solution of the augmented Lagrange formulation for the RTLS problem, and Sima, Van Huffel, and Golub [Regularized Total Least Squares Based on ...
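The RTLS methods above regularize the classical total least squares problem. As background, the unregularized TLS solution of Ax ≈ b is read off from the right singular vector of the augmented matrix [A b] belonging to its smallest singular value. A minimal NumPy sketch of that baseline (the function name `tls` and the genericity tolerance are our own; this is not the Golub–Hansen–O’Leary RTLS algorithm itself):

```python
import numpy as np

def tls(A, b):
    """Classical (unregularized) total least squares for Ax ~ b:
    take the right singular vector of [A b] belonging to the
    smallest singular value and scale its last component to -1."""
    n = A.shape[1]
    C = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(C)
    v = Vt[-1]                       # row for the smallest singular value
    if abs(v[n]) < 1e-12:            # hypothetical genericity tolerance
        raise np.linalg.LinAlgError("nongeneric TLS problem")
    return -v[:n] / v[n]

# on a consistent system, TLS recovers the exact solution
A = np.array([[2.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([1.0, 2.0])
b = A @ x_true
print(np.allclose(tls(A, b), x_true))   # True
```

RTLS adds a Tikhonov-like constraint on top of this formulation, which is what makes the parameter-dependent algorithms of the paper necessary.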
Skew-Hamiltonian and Hamiltonian eigenvalue problems: Theory, algorithms and applications
 Proceedings of ApplMath03, Brijuni (Croatia)
Cited by 13 (6 self)
Skew-Hamiltonian and Hamiltonian eigenvalue problems arise in a number of applications, particularly in systems and control theory. The preservation of the underlying matrix structures often plays an important role in these applications and may lead to more accurate and more efficient computational methods. We will discuss the relation of structured and unstructured condition numbers for these problems as well as algorithms exploiting the given matrix structures. Applications of Hamiltonian and skew-Hamiltonian eigenproblems are briefly described.
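One structure the abstract alludes to: the eigenvalues of a real Hamiltonian matrix H = [[A, G], [Q, -Aᵀ]] with G, Q symmetric occur in pairs {λ, -λ}. A small NumPy check of this property (illustrative only; it is not one of the structure-preserving algorithms the paper develops):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
G = rng.standard_normal((n, n)); G = G + G.T   # symmetric
Q = rng.standard_normal((n, n)); Q = Q + Q.T   # symmetric
H = np.block([[A, G], [Q, -A.T]])              # Hamiltonian: J H is symmetric

ev = np.linalg.eigvals(H)
# for every eigenvalue lambda of H, -lambda is also an eigenvalue
paired = all(np.abs(ev + lam).min() < 1e-8 for lam in ev)
print(paired)   # True
```

Unstructured eigensolvers (as used here) can destroy this pairing in finite precision for ill-conditioned problems, which is exactly the motivation for the structure-exploiting algorithms the paper discusses.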
On the QR algorithm and updating the SVD and URV decomposition in Parallel
 Lin. Alg. Appl.
, 1993
Cited by 10 (0 self)
A Jacobi-type updating algorithm for the SVD or URV decomposition is developed, which is related to the QR algorithm for the symmetric eigenvalue problem. The algorithm employs one-sided transformations and therefore provides a cheap alternative to earlier updating algorithms based on two-sided transformations. The present algorithm, as well as the corresponding systolic implementation, is roughly twice as fast as the former method, while the tracking properties are preserved. The algorithm is also extended to the 2-matrix QSVD or QURV case. Finally, the differences with a number of closely related, recently proposed algorithms are discussed. I. Introduction. In an earlier report [16], an adaptive algorithm was developed for tracking the singular value decomposition of a data matrix when new observations (rows) are added continuously. The algorithm may be organized such that it provides at each time a certain approximation for the exact ...
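As a rough illustration of the one-sided idea, here is a plain Hestenes one-sided Jacobi SVD in NumPy: columns of A are rotated pairwise until mutually orthogonal, after which the column norms are the singular values. This is the textbook batch method, not the updating/tracking algorithm of the paper, and it assumes A has full column rank:

```python
import numpy as np

def one_sided_jacobi_svd(A, max_sweeps=30, tol=1e-12):
    """Hestenes' one-sided Jacobi SVD (textbook sketch).
    Rotates pairs of columns of A until all columns are mutually
    orthogonal; assumes full column rank."""
    A = np.array(A, dtype=float)
    n = A.shape[1]
    V = np.eye(n)
    for _ in range(max_sweeps):
        rotated = False
        for p in range(n - 1):
            for q in range(p + 1, n):
                app = A[:, p] @ A[:, p]
                aqq = A[:, q] @ A[:, q]
                apq = A[:, p] @ A[:, q]
                if abs(apq) <= tol * np.sqrt(app * aqq):
                    continue                      # already orthogonal
                rotated = True
                # rotation angle that orthogonalizes columns p and q
                tau = (aqq - app) / (2.0 * apq)
                t = 1.0 if tau == 0.0 else np.sign(tau) / (abs(tau) + np.hypot(1.0, tau))
                c = 1.0 / np.hypot(1.0, t)
                s = c * t
                G = np.array([[c, s], [-s, c]])
                A[:, [p, q]] = A[:, [p, q]] @ G   # one-sided: columns only
                V[:, [p, q]] = V[:, [p, q]] @ G
        if not rotated:
            break
    sigma = np.linalg.norm(A, axis=0)
    U = A / sigma                                 # normalize columns
    return U, sigma, V

A0 = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
U, s, V = one_sided_jacobi_svd(A0)
print(np.allclose((U * s) @ V.T, A0))   # True: factors reconstruct A0
```

Because only columns are transformed, each rotation touches two columns of A and V and nothing else, which is what makes the systolic (parallel) implementation cheap compared with two-sided schemes.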
The CSD, GSVD, their Applications and Computations
 University of Minnesota
, 1992
Cited by 10 (0 self)
Since the CS decomposition (CSD) and the generalized singular value decomposition (GSVD) emerged as generalizations of the singular value decomposition about fifteen years ago, they have proved to be very useful tools in numerical linear algebra. In this paper, we review the theoretical and numerical development of the decompositions, discuss some of their applications, and present some new results and observations. We also point out some open problems. A Fortran 77 code has been written that computes the CSD and the GSVD.
Keywords: singular value decomposition, CS decomposition, generalized singular value decomposition.
Subject Classifications: AMS(MOS): 65F30; CR: G1.3
1 Introduction. The singular value decomposition (SVD) of a matrix is one of the most important tools in numerical linear algebra. It has been widely used in scientific computing. Recently, Stewart [52] gave an excellent survey of the early history of the SVD, going back to the contributions of E. Beltrami and C. Jord...
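When B is square and nonsingular, the generalized singular values of the pair (A, B) coincide with the ordinary singular values of A B⁻¹. A NumPy sketch of this "quotient" characterization via a Cholesky reduction (it forms AᵀA and BᵀB, squaring the condition number, so it only illustrates the definition; the CSD/GSVD algorithms reviewed in the paper avoid exactly this):

```python
import numpy as np

def gen_singular_values(A, B):
    """Generalized singular values of (A, B), B square nonsingular:
    square roots of the generalized eigenvalues of (A^T A, B^T B).
    Illustrative only -- forming the cross products squares the
    condition number, which CSD-based algorithms avoid."""
    L = np.linalg.cholesky(B.T @ B)           # B^T B = L L^T
    Linv = np.linalg.inv(L)
    M = Linv @ (A.T @ A) @ Linv.T             # symmetric reduced problem
    lam = np.linalg.eigvalsh(M)               # ascending eigenvalues
    return np.sqrt(np.maximum(lam, 0.0))[::-1]   # descending

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))
B = rng.standard_normal((3, 3)) + 3.0 * np.eye(3)   # keep B well conditioned
print(np.allclose(gen_singular_values(A, B),
                  np.linalg.svd(A @ np.linalg.inv(B), compute_uv=False)))
```

The check against svd(A B⁻¹) confirms the quotient interpretation; for rank-deficient or ill-conditioned B, only the CSD route remains numerically meaningful.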
On Propagating Orthogonal Transformations in a Product of 2 x 2 Triangular Matrices
, 1993
Cited by 9 (1 self)
In this note, we propose an implicit method for applying orthogonal transformations on both sides of a product of upper triangular 2 × 2 matrices while preserving the upper triangularity of the factors. Such problems arise in Jacobi-type methods for computing the PSVD of a product of several matrices, and in ordering eigenvalues in the periodic Schur decomposition.
Analyses, Development, and Applications of TLS Algorithms in Frequency Domain System Identification
, 1998
Cited by 5 (3 self)
This paper gives an overview of frequency domain total least squares (TLS) estimators for rational transfer function models of linear time-invariant multivariable systems. The statistical performance of the different approaches is analyzed through their equivalent cost functions. Both the generalized and the bootstrapped total least squares (GTLS and BTLS) methods require exact knowledge of the noise covariance matrix. The paper also studies the asymptotic behavior (as the number of data points goes to infinity) of the GTLS and BTLS estimators when the exact noise covariance matrix is replaced by the sample noise covariance matrix obtained from a (small) number of independent data sets. Even if only two independent repeated observations are available, it is shown that the estimates are still strongly consistent without any increase in the asymptotic uncertainty.
On a variational formulation of the generalized singular value decomposition
 SIAM J. Matrix Anal. Appl.
, 1997
Cited by 2 (0 self)
Abstract. A variational formulation for the generalized singular value decomposition (GSVD) of a pair of matrices A ∈ R^{m×n} and B ∈ R^{p×n} is presented. In particular, a duality theory analogous to that of the SVD provides new understanding of left and right generalized singular vectors. It is shown that the intersection of the row spaces of A and B plays a key role in the GSVD duality theory. The main result that characterizes left GSVD vectors involves a generalized singular value deflation process.
On the error analysis and implementation of some eigenvalue decomposition and singular value decomposition algorithms
, 1996
Cited by 2 (0 self)
Many algorithms exist for computing the symmetric eigendecomposition, the singular value decomposition, and the generalized singular value decomposition. In this thesis, we present several new algorithms and improvements on old algorithms, analyzing them with respect to their speed, accuracy, and storage requirements. We first discuss variations on the bisection algorithm for finding eigenvalues of symmetric tridiagonal matrices. We show the challenges in implementing a correct algorithm in floating point arithmetic, and how reasonable-looking but incorrect implementations can fail. We carefully define correctness and present several implementations that we rigorously prove correct. We then discuss a fast implementation of bisection using parallel prefix. We show many numerical examples of the instability of this algorithm, and then discuss its forward error and backward error analysis. We also discuss possible ways to stabilize it by using iterative refinement. Finally, we discuss how to use a divide-and-conquer algorithm to compute the singular value decomposition and solve the linear least squares problem, and how to implement ...
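The bisection algorithm referred to above rests on the Sturm-sequence count: the number of negative pivots in the LDLᵀ factorization of T − xI equals, by Sylvester's law of inertia, the number of eigenvalues of the tridiagonal matrix T below x. A minimal NumPy sketch (the zero-pivot guard here is a naive stand-in for the careful floating-point analysis the thesis is about):

```python
import numpy as np

def count_less(d, e, x):
    """Number of eigenvalues below x of the symmetric tridiagonal
    matrix with diagonal d and off-diagonal e: count negative pivots
    of the LDL^T factorization of T - x I (Sturm sequence)."""
    count, pivot = 0, 1.0
    for i in range(len(d)):
        pivot = (d[i] - x) - (e[i - 1] ** 2 / pivot if i > 0 else 0.0)
        if pivot == 0.0:                # naive guard for an exact zero pivot
            pivot = -np.finfo(float).tiny
        if pivot < 0.0:
            count += 1
    return count

def kth_eigenvalue(d, e, k, tol=1e-12):
    """k-th smallest eigenvalue (0-based) by bisection on count_less."""
    r = np.max(np.abs(d)) + 2.0 * (np.max(np.abs(e)) if len(e) else 0.0)
    lo, hi = -r, r                      # Gershgorin enclosure of the spectrum
    while hi - lo > tol * max(1.0, abs(lo) + abs(hi)):
        mid = 0.5 * (lo + hi)
        if count_less(d, e, mid) > k:   # k-th eigenvalue lies below mid
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

d, e = np.array([2.0, 2.0, 2.0]), np.array([1.0, 1.0])
# eigenvalues of this matrix are 2 - sqrt(2), 2, 2 + sqrt(2)
print(abs(kth_eigenvalue(d, e, 0) - (2.0 - np.sqrt(2.0))) < 1e-9)   # True
```

The subtle part, and the subject of the thesis, is that a careless version of `count_less` can be non-monotonic in x under floating point arithmetic, which breaks the bisection invariant.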
On the reduction of Tikhonov minimization problems and the construction of regularization matrices
 Numer. Algorithms
Cited by 2 (1 self)
Abstract. Tikhonov regularization replaces a linear discrete ill-posed problem by a penalized least-squares problem, whose solution is less sensitive to errors in the data and to roundoff errors introduced during the solution process. The penalty term is defined by a regularization matrix and a regularization parameter. The latter generally has to be determined during the solution process, which requires repeated solution of the penalized least-squares problem. It is therefore attractive to transform the least-squares problem to a simpler form before solution. The present paper describes a transformation of the penalized least-squares problem to a simpler form that is faster to compute than available transformations in the situation when the regularization matrix has linearly dependent columns and no exploitable structure. Properties of this kind of regularization matrix are discussed and their performance is illustrated.
Key words: ill-posed problem, Tikhonov regularization, regularization matrix, GSVD
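The penalized least-squares problem min ‖Ax − b‖² + μ²‖Lx‖² can always be posed as one ordinary least-squares problem by stacking the penalty under the data term. A NumPy sketch (the function name `tikhonov` is ours; re-solving this stacked system for many values of μ is precisely the cost that motivates the paper's pre-transformation):

```python
import numpy as np

def tikhonov(A, b, L, mu):
    """Solve min_x ||A x - b||^2 + mu^2 ||L x||^2 by stacking the
    penalty term under the data term as a single least-squares problem."""
    K = np.vstack([A, mu * L])                    # [A; mu L]
    rhs = np.concatenate([b, np.zeros(L.shape[0])])
    x, *_ = np.linalg.lstsq(K, rhs, rcond=None)
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((8, 4))
b = rng.standard_normal(8)
L = np.eye(4)                     # standard-form regularization matrix
x = tikhonov(A, b, L, mu=0.5)
# same x from the normal equations (A^T A + mu^2 L^T L) x = A^T b
xn = np.linalg.solve(A.T @ A + 0.25 * L.T @ L, A.T @ b)
print(np.allclose(x, xn))   # True
```

Each call refactors the stacked matrix from scratch; transforming the pair (A, L) to a simpler form once, as the paper describes, makes the subsequent sweep over μ cheap.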
Recent Developments in Dense Numerical Linear Algebra
, 1997
Cited by 1 (0 self)
We survey recent developments in dense numerical linear algebra, covering linear systems, least squares problems, and eigenproblems. Topics considered include the design and analysis of block, partitioned, and parallel algorithms; condition number estimation; componentwise error analysis; and the computation of practical error bounds. Frequent reference is made to LAPACK, the state-of-the-art package of Fortran software designed to solve linear algebra problems efficiently and accurately on high-performance computers.